Santa,

In this post & earlier ones, I have identified several files that you might want to study the contents of ... what is this or that field used for in your environment. When studying a file that I am unfamiliar with, I like to do a 3-step process:

1. DSPFD FLT (or whatever the file is), F4 prompt, output *PRINT ... the IBM layout reference.
2. RUNQRY FLT, F4 prompt, *YES for record selection at the bottom.
3. A side session running simultaneously with WRKQRY, looking at the file layout.

WRKQRY supplements DSPFD for rapid navigation, to make sure I am on the right field name. Typically in RUNQRY I am looking at some column of data ... "what the heck is this," I ask myself, about the population I am seeing. I use RUNQRY F19/F20 to count sideways ... this is field # 25 I am looking at. I use WRKQRY F19 to count vertically ... field # 25 is called _____ whatever, & I page through the DSPFD printout to jot down some notes there. I speculate about the relationship of the field # 25 data to various other fields. The reason for RUNQRY record selection *YES is that I can F12 back to the selection screen multiple times, entering selection criteria to check out the various patterns of data population that I am seeing.

When you gave your extreme example, were you looking at ALL THE DATES in ALL THE FILES ... FSO, FOD, FMA? In my experience the dates are NOT identical, but are backward scheduled across multiple operations. FSO says when the shop order needs to be completed, but FOD has start dates that are scattered over a period of several days, depending on the routings spelling out the rate at which the work can be done & the volume that needs to be done.

FSO has one record per shop order.
FOD has one type-1 record per operation.
FMA has one record per material consumed by the shop order.

Remember that we are not using negative days in our company, but in reality we are. What Time Basis Code are you using? Do you understand Time Basis Codes in the routings? For example, one says that each unit can be made in 0.0001 of a time period, another says that x,xxxx units can be made in 1 time period. Are your work centers set up for 8-hour days, or have you made allowances for people getting potty breaks & other reasons why, when we pay people for an 8-hour day, they really are working a little less than that?

Your corporate goals may not be quite the same as ours, and my role in this is also a bit different. Our goals are to make a profit, deliver product to the customer when we promised delivery, & have zero tolerance for quality problems in the delivery. This means there is great interest in such topics as cost variance ... we compare price / standard cost / actual cost. My role is to make the process easier on the people working with the system.

In our labor reporting we seek to accurately capture the hours it took to make the product, the material consumed, the scrap, & the down time. We have also modified the process to reduce the workload needed to enter labor tickets, to reduce the risk of human error, and to capture some information into other data base elements ... mainly we track the tools necessary to the production process, such as molds, then have later reports showing scrap rates by tool-id, which tells factory management which of those tools is in greatest need of preventive maintenance.

Without modifying CST900 itself, I have made modifications to the tracking process. Before CST900 runs, I have a query recreate a summary Query/400 work file that contains some FSO information on the orders coded for purge & what the actual cost is on those items before the purge.
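If it helps to picture the mechanics, here is a bare-bones sketch of what that pre-CST900 step can look like in CL. The library, query, & report names here (COSTLIB, PREPURGEQ, VARCLUE1) are placeholders for illustration, not the actual object names, and how CST900 itself gets submitted is site-specific, so that part is only a comment.

    PGM  /* pre-CST900 step of the jobstream                        */
    /* Rebuild the snapshot work file of shop orders coded for      */
    /* purge, with their actual costs as they stand before the      */
    /* purge. PREPURGEQ is a Query/400 definition over FSO whose    */
    /* output option replaces its *OUTFILE.                         */
    RUNQRY QRY(COSTLIB/PREPURGEQ)
    /* CST900 runs next in the jobstream (invocation not shown).    */
    /* A companion CL afterwards prints the clue reports that       */
    /* compare the snapshot to the updated costs, e.g.:             */
    /* RUNQRY QRY(COSTLIB/VARCLUE1) OUTTYPE(*PRINTER)               */
    ENDPGM

The point of keeping it as two small CL steps wrapped around CST900 is that new clue reports can be added to the after-step without touching CST900 at all.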
Then after the purge, I run reports on the new actual cost variances on those items, and on how much the actual cost has changed thanks to the CST900 run. This is all in a CL jobstream, with one program before CST900 & another after it, and we periodically add new clue reports to the collection.

Also before CST900 we run a program which is a clone of CST270, using a different logical over FOD to eliminate from the report the garbage that SSA says is supposed to be there (treating type-3 lines as if they were type-1). The queries tell us which items in recently closed production had bad cost variances. We run a search on the CST270 clone to see the whole WIP cost story on those items ... invariably 80% or more of the variance is in one subcomponent. We then search the CST270 clone for that subcomponent. We keep CST270 clones from several past CST900 runs. 90% of our cost variances, analyzed by this means, can be traced to labor reporting in which the actual rates on some subcomponent just came nowhere close to standard.

If you want to take a look at this kind of information without doing all the gyrations that we have indulged in so far:

FLT has labor history ... lots of information about RATES & very little about costs ... how much work was done in what time period & how that compares to the standard. The cost in our case comes from the CEM file, via the clock # used in the labor reporting.

ITH has the actual cost & standard cost of the inventory transaction ... we use 3 kinds ... receipts of completed materials, issues of consumed materials, & consumption of materials on rejects.

There is a labor ticket # assigned by JIT600, & it goes into both the FLT transaction & the corresponding ITH transaction on the same reporting, so you can link the two by the ticket # & get the whole story on the rates & costs as of the time the labor reporting went into the system, and you can also get the date this was actually reported to the system. The date is important because when CST900 updates costs, it uses the latest costs of the material being consumed, including costs just updated by the previous shop order purged. Thus it is of interest to do a report on a particular item's actual costs by the date/time processed, from labor through ITH, to show how the actual cost jumps periodically due to the CST900 impact.

MacWheel99@aol.com (Alister Wm Macintyre) (Al Mac)
BPCS 405 CD Manager / Programmer @ Global Wire Technologies Incorporated
http://www.globalwiretechnologies.com = new name, same quality wire engineering company; fax # 812-424-6838