|
On 08.10.2025 at 20:23, Don Brown via MIDRANGE-L <midrange-l@xxxxxxxxxxxxxxxxxx> wrote:
Hi Daniel
Thanks for the suggestion - I tried it in ACS Run SQL and while there was
no error, nothing was returned.
I do not yet understand the columns(...) and path requirements.
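A minimal sketch of how the two path expressions relate may help - the JSON
literal and column name below are only illustrative, not taken from the
actual document. The row path after the document decides what each result
row represents, and every column path is then evaluated relative to that row:

    select t.booking
      from json_table(
             '{"errored_booking_numbers":["34515897","34515898"]}',
             'lax $.errored_booking_numbers[*]'   -- row path: one row per array element
             columns(
               booking varchar(20) path '$'       -- column path, relative to the current row item
             )
           ) as t;

In Daniel's query the row path is 'lax $' (one row for the whole document),
so the single column path has to do all of the drilling into the nested
arrays by itself.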
Thanks
Don
Don Brown
Senior Consultant
OneTeam IT Pty Ltd
P: 1300 088 400
-----Original Message-----
From: MIDRANGE-L <midrange-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of
Daniel Gross
Sent: Thursday, 9 October 2025 9:04 AM
To: midrange-l@xxxxxxxxxxxxxxxxxx
Subject: Re: Data-INTO with double array fails but is valid JSON
I know that this is an RPG / DATA-INTO question, but have you thought about
doing it with SQL / JSON_TABLE?
JSON_TABLE has absolutely no problem with un-named JSON arrays, as you can
simply write the JSON path to the elements to get them.
select *
  from json_table(
         '...your-json-document...',
         'lax $'
         columns(
           error_details varchar(255) path '$.error_details[*][*]'
         )
       ) as json_doc;
And you get a row-by-row list of the contents of the "error_details"
elements. In your example it would be only one row - but it could also be
multiple rows - and you can also retrieve the index of the outer (named)
and the inner (un-named) array.
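To illustrate that last point, here is a hedged sketch of one way the
indexes could be exposed, assuming the NESTED path and FOR ORDINALITY
clauses of JSON_TABLE are available on your Db2 for i release (the column
names are only illustrative):

    select j.outer_idx, j.inner_idx, j.error_text
      from json_table(
             '...your-json-document...',
             'lax $'
             columns(
               nested path '$.error_details[*]'
                 columns(
                   outer_idx for ordinality,          -- position in the named outer array
                   nested path '$[*]'
                     columns(
                       inner_idx for ordinality,      -- position in the un-named inner array
                       error_text varchar(255) path '$'
                     )
                 )
             )
           ) as j;

Each result row then carries its position in the outer error_details array
and in the un-named inner array alongside the text itself.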
DATA-INTO is good, don't get me wrong - but JSON_TABLE is far more
flexible.
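And for an RPG caller, that flexibility does not require leaving the
program. Below is a hedged sketch (not code from this thread) of the same
read done through embedded SQL: it assumes NESTED path support in
JSON_TABLE on your release, uses GET_CLOB_FROM_FILE to read the stream file
(which requires the job to run with commitment control), and the cursor and
host variable names are illustrative only (:rData.File stands for whatever
host variable holds the stream file path).

      // Hedged sketch: one fetched row per element of the un-named inner
      // error_details array, read straight from the IFS file.
      dcl-s errText varchar(255);

      exec sql declare errCsr cursor for
        select j.error_text
          from json_table(
                 get_clob_from_file(trim(:rData.File)),
                 'lax $'
                 columns(
                   nested path '$.error_details[*]'
                     columns(
                       nested path '$[*]'
                         columns( error_text varchar(255) path '$' )
                     )
                 )
               ) as j;

      exec sql open errCsr;
      exec sql fetch errCsr into :errText;
      dow sqlcode = 0;
        // each errText holds one message string from the inner array
        exec sql fetch errCsr into :errText;
      enddo;
      exec sql close errCsr;

The two nested path clauses handle the drilling into the outer array and
the un-named inner array.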
I know this was - once again - one of my "why don't you use SQL" answers
... sorry for that.
Kind regards,
Daniel
On 08.10.2025 at 18:06, Don Brown via MIDRANGE-L <midrange-l@xxxxxxxxxxxxxxxxxx> wrote:
Yeah Brad,
I agree but we have no control over this.
I know I have seen that exact issue discussed here, but do you think I can
find it ... Arrgghhh!!!
Don Brown
Senior Consultant
OneTeam IT Pty Ltd
P: 1300 088 400
-----Original Message-----
From: MIDRANGE-L <midrange-l-bounces@xxxxxxxxxxxxxxxxxx> On Behalf Of
Brad Stone
Sent: Wednesday, 8 October 2025 10:35 PM
To: Midrange Systems Technical Discussion
Subject: Re: Data-INTO with double array fails but is valid JSON
I hope you figure it out. That error details array is foobar. It may be
valid but it's not "normal".
Brad
On Wed, Oct 8, 2025 at 1:1 AM Don Brown via MIDRANGE-L <
midrange-l@xxxxxxxxxxxxxxxxxx> wrote:
Hi
I am sure I have seen someone else with this error, but searching the
archives has not found the incident.
The JSON has a double [[ ]] indicating two arrays. If I remove one set
(leaving [ ]) it processes correctly with the data structure below.
So we have an unnamed array inside an array - how do we build the data
structure for that?
We have this JSON
{"error_details":[
["34515897: meter_off_time 2025-10-06 01:57:06+00:00 is equal to or
before meter_on_time '2025-10-06 02:45:04+00:00'"] ],
"errored_booking_numbers":["34515897"],
"ignored_booking_numbers":[],
"saved_booking_numbers":["34527032","34526927","34526855","34521442"]
, "summary":{"error":1,"ignored_duplicate":0,"saved":4}}
I have checked and the JSON is valid.
This is the data structure.
We originally had "error_details" in the structure but had the same
problem, so we removed it thinking that might work.
dcl-ds SATSSData_rUploadData qualified template inz;
  dcl-ds rData likeds(DKT_GeneralReturnData);
  dcl-ds summary;
    saved             zoned(5) inz(*zeros);
    ignored           zoned(5) inz(*zeros);
    ignored_duplicate zoned(5) inz(*zeros);
    error             zoned(5) inz(*zeros);
  end-ds summary;
  count_saved_booking_numbers   zoned(5) inz(*zeros);
  saved_booking_numbers         varchar(20) inz('') dim(C_SATSS_MAXDATA);
  count_ignored_booking_numbers zoned(5) inz(*zeros);
  ignored_booking_numbers       varchar(20) inz('') dim(C_SATSS_MAXDATA);
  count_errored_booking_numbers zoned(5) inz(*zeros);
  errored_booking_numbers       varchar(20) inz('') dim(C_SATSS_MAXDATA);
end-ds;
RPG Code
d rtnData         DS                  likeds(SATSSData_rUploadData)

  monitor;
    data-into rtnData
      %Data( %trim(rData.File) :
             'doc=file case=convert allowmissing=yes allowextra=yes +
              countprefix=count_ trim=all' )
      %Parser( 'YAJLINTO' );
  on-error;
    wrkMessage = 'Fail to Convert ' + %trim(rData.File);
    UpdateLog(wrkMessage);
    exsr Return;
  endmon;
Error Message
Data-Into Error message:
Message ID . . . . . . : RNX0356
Date sent . . . . . . : 08/10/25      Time sent . . . . . . : 14:33:00
Message . . . . : The document for the DATA-INTO operation does not match
  the RPG variable; reason code 5.
Cause . . . . . : While parsing a document for the DATA-INTO operation,
  the parser found that the document does not correspond to the RPG
  variable "rtndata" and the options do not allow for this. The reason
  code is 5. The exact subfield for which the error was detected is
  "rtndata". The options are "doc=file case=convert allowmissing=yes
  allowextra=yes countprefix=count_ trim=all". The document name is
/HOME/SATSS/DOWNLOAD/AIT_20251008_001644918_SATSS_UPLOADDATA.json; *N
indicates that the document is not an external file. The parser is
'YAJLINTO'. *N indicates that the parser is a procedure pointer.
Recovery . . . : Contact the person responsible for program
maintenance to determine the cause of the problem.
Technical description . . . . . . . . : Reason codes and their
meanings are as follows:
1. The specified path to the name was not found in the document.
2. The document contains too few array elements for array subfields of a
data structure.
3. The document contains too many array elements for array subfields of a
data structure.
4. The document is missing information to match subfields.
5. The document contains extra names that do not match subfields.
6. The document contains text content within the content for the
subfields of a data structure.
7. The document contains subfield items for RPG scalar fields,
subfields or
Any assistance appreciated.
Thanks
Don
Brisbane - Sydney - Melbourne
Don Brown
Senior Consultant
P: 1300 088 400