Laurence -

You would want to SAVCHGOBJ OBJJRN(*YES) plus other pertinent parms on the
command. REFDATE(*SAVLIB) indicates that you want to save any objects
changed since the last SAVLIB command against each library. I suppose you
could enter a specific date in that field (maybe you want to save all
objects changed since January 1st) but I can't come up with a scenario
where you'd need to.
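
For your situation the save might look something like this - the library
and save file names are just placeholders for whatever your process
actually uses:

SAVCHGOBJ OBJ(*ALL) LIB(MYLIB) DEV(*SAVF) SAVF(QGPL/CHGSAVF) +
          REFDATE(*SAVLIB) OBJJRN(*YES)

That picks up every object in the library changed since the last SAVLIB
against it, including objects that are currently being journaled.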

Seems like this is being made harder than it needs to be.

Thanks,

Steve McKay
(205) 585-8424
samckay1@xxxxxxxxx



On Mon, Apr 27, 2020 at 5:08 PM Laurence Chiu <lchiu7@xxxxxxxxx> wrote:

To answer my own question with comments from our IBM i personnel - I just
wanted to get an expert view of their response.

When the OS upgrade was done from 5.4(?) to 7.1, the systems support
vendor apparently issued this command:

https://bit.ly/2YavyZD

The Power team doesn't know exactly what this did but had the following
queries based on reading the IBM i manual.

"IBM states some objects may not be included in the save. This needs to
be coded and tested, and the business may need to sign off on the risk
that some objects are not included." They then quoted this from the manual.

The Save Changed Object (SAVCHGOBJ) command saves a copy of each changed
object or group of objects located in the same library. When *ALL is
specified for the Objects (OBJ) parameter, objects can be saved from all
user libraries or from a list of libraries. When saving to a save file,
only one library can be specified. For database files, only the changed
members are saved.

Objects changed since the specified date and time are saved with the
following exceptions:

- If OBJJRN(*NO) is specified, objects currently being journaled are not
saved, unless journaling was started after the specified date and time.
This ensures that changes made to an object before journaling starts are
not lost (because they were not journaled in a journal receiver).
- Freed objects (programs, files, journal receivers, and so forth) are
not saved.
- User-defined messages, job and output queue definitions, and logical
file definitions are saved, but the contents of those objects are not
saved. Logical file access paths are saved if ACCPTH(*YES) is specified.
The contents of a data queue can be saved by specifying *DTAQ for the
Queue data (QDTA) parameter.
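
If those exceptions are a concern, I assume we could name the relevant
parameters explicitly on the save - something like the following, where
the library and save file are placeholders since I don't have the real
command in front of me:

SAVCHGOBJ OBJ(*ALL) LIB(MYLIB) DEV(*SAVF) SAVF(QGPL/CHGSAVF) +
          OBJJRN(*YES) ACCPTH(*YES) QDTA(*DTAQ)

so that journaled objects, logical file access paths and data queue
contents are all included in the save.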


Basically what we want is the exact same items saved as would be with the
SAVOBJ command, except that only the changed objects are saved and then
restored. That (to my un-expert view) means the command that is currently
issued, viz. CMD(SAVOBJ OBJ(*ALL) ...), would be changed to
CMD(SAVCHGOBJ OBJ(*ALL) ...).
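
In other words, if the current job runs something along these lines (the
library and save file names are placeholders, not the actual command),
the change would just be the verb plus the reference date:

Current:  CMD(SAVOBJ OBJ(*ALL) LIB(MYLIB) DEV(*SAVF) SAVF(QGPL/MYSAVF))
Proposed: CMD(SAVCHGOBJ OBJ(*ALL) LIB(MYLIB) DEV(*SAVF) SAVF(QGPL/MYSAVF) +
              REFDATE(*SAVLIB))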

So that still covers all objects but saves only the ones that were changed
since the last full save. Wouldn't coding OBJ(*ALL) mean that the
exceptions noted above do not apply?

They want to test this so I guess that's valid. Just not sure how to do it
yet.


On Sat, Apr 25, 2020 at 10:16 AM Laurence Chiu <lchiu7@xxxxxxxxx> wrote:



On Fri, Apr 24, 2020 at 11:39 PM Steve McKay <samckay1@xxxxxxxxx> wrote:

Laurence -

1. If that is the actual command, the TGTRLS parameter could be updated
to reflect the current releases of your systems (V7R1 instead of V5R4).
You aren't really hurting anything if you don't change it but if you get
an object that requires a higher release of OS support than V5R4, you
won't be able to save it with this particular command due to the V5R4
specification.

2. If you're happy with the process as it is - you can make the suggested
tweaks and proceed as usual


As has been stated, the commands have not been changed since the source
systems went to 7.1. So during the period when the source was on 7.1 and
the target was still on an earlier release, no issues occurred, right up
until the target was also upgraded.
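
If we do make that tweak, I assume it would just be a matter of changing
the release level on the save command (assuming the existing command
names the release explicitly), e.g.

TGTRLS(V7R1M0)   instead of   TGTRLS(V5R4M0)

or simply TGTRLS(*CURRENT) now that both systems are on the same release.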


3. Yes, you have the concept - this is a differential save. SAVOBJ and
SAVCHGOBJ are both restored with the RSTOBJ command - it will restore
objects from your SAVF regardless of how they were saved. You can also
use RSTOBJ to restore objects saved with the SAVLIB command should you
perform a SAVLIB and only need to restore 1 or 2 objects rather than the
entire library.
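
For example, restoring on the target from the save file might look
something like this (library and save file names are placeholders):

RSTOBJ OBJ(*ALL) SAVLIB(MYLIB) DEV(*SAVF) SAVF(QGPL/CHGSAVF) +
       MBROPT(*ALL) RSTLIB(MYLIB)

The same RSTOBJ works whether the save file came from SAVOBJ or SAVCHGOBJ.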


Thanks. I have asked the team to look at it, but I have to be diplomatic
since I am a consultant, not a permanent employee, and don't want to tell
the organisation's staff how to suck eggs :-( Still, the weight of opinion
on this thread suggests it is worth asking. Over dark fibre and GbE it was
never an issue, with the 30G file taking only 20 minutes to send to the
target system.

4. see #3 above.


Do you have BRMS on your system? (GO LICPGM option 10 - look for
5770-BR1). If so, is it being used? You referenced BRMS in your initial
email, which brought about a couple of responses regarding its use in
resolving your issue. If you have BRMS, you may wish to review those
responses.

We do, and we use it regularly to back up the data to a ProtecTIER VTL.
But the target system does not have a VTL. If the differential save works,
that will be fine for the time we need to do this. The goal is to migrate
the entire system to a separate prod and DR environment, and once that is
done, there will be no file transfers required.

Thanks,

Steve McKay
(205) 585-8424
samckay1@xxxxxxxxx


