From time to time, there is a need to modify information systems due to changes in legislation (like SOX), standards, currency change (like the euro), and more. These types of changes have a substantial impact on many components of an information system and therefore contain a high risk factor.
BluePhoenix Mainframe Architecture (diagram): Inventory, Reports, Generation of Tools, Conversion, Unit Test, System Test, Implementation. Supporting elements: Repository Enhancement, Repository Files, Specific Cluster Metadata Libraries, Generation Parameters Libraries, Converted Libraries, Implementation Parameters.
ITD – Incremental Running (diagram): IT Discovery Collection Scripts, Control Datasets Catalog, IT Discovery Repository, C-Scan Engine, DB Definitions, Changes/Additions/Deletions ONLY, Libraries.
Course agenda:
Methodology – SUN AM
C-Scan language bits and bytes – SUN PM
The major special-purpose "exit" – MON AM (includes exercise time)
The special-purpose files we use to analyze and convert systems, with their corresponding reports – MON PM
Other special-purpose "exits" – TUES AM (includes exercise time, which will extend into the afternoon)
A walk through the conversion process, including a case study you will do on your own – WED and THURS AM
Summary – THURS PM
No homework, but you'll find that reading through the reference material and reviewing your class notes will be helpful.
We sell service, not technology. It is the combination of our people, our methodology, and our tools that delivers our service.
Describe the flow process of all six phases.
A questionnaire covers the management, development and maintenance processes; the existing systems and applications; the hardware, data and system software; the current practices; and IS principles.
A survey and assessment is completed to identify the Year 2000 date-affected components (software, hardware, procedures, databases, etc.). Clusters, data bridges and interfaces are also identified for conversion.
The conversion process comprises multiple phases run iteratively, fine-tuning the conversion controls until compilable code is achieved.
Intra-cluster tests (unit and regression) are performed on all components within the cluster.
Acceptance testing is a client-driven testing process that demonstrates the application's ability to meet the acceptance criteria previously defined with the client.
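The iterative fine-tuning step described above can be sketched schematically. This is a hypothetical Python illustration, not the actual Toolbox mechanism: the `convert`, `compile_ok`, and `refine` callables stand in for the real conversion run, compile check, and control-parameter adjustment.

```python
# Schematic sketch of the iterative conversion loop: re-run the conversion,
# refining the control parameters each pass, until the converted code compiles.
# All helper callables here are hypothetical stand-ins.
def convert_until_clean(source, controls, convert, compile_ok, refine, max_passes=10):
    for _ in range(max_passes):
        converted = convert(source, controls)
        if compile_ok(converted):
            return converted, controls          # success: code compiles
        controls = refine(controls, converted)  # fine-tune controls and retry
    raise RuntimeError("conversion did not converge")

# Toy usage: a "conversion" that widens a 2-digit year field once the
# controls are refined to request expansion.
converted, final_controls = convert_until_clean(
    "MOVE YY TO OUT",
    controls={"expand": False},
    convert=lambda s, c: s.replace("YY", "YYYY") if c["expand"] else s,
    compile_ok=lambda s: "YYYY" in s,
    refine=lambda c, _: {"expand": True},
)
```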
Benefits:
- Maximize automation of the conversion process.
- Minimize interference with production systems maintenance.
- Minimize the code-freeze period.
- Progressive conversion of logically linked subsystems (clusters).
- Conversion process transparent to the end user.
Capabilities:
- Produces a database of the organization's software inventory.
- Produces a database of all date fields and their cross-references.
- Provides a large range of software-inventory cross-reference reports.
- Provides computerized templates that control the conversion process.
- Provides automated update capabilities that support database and application-logic changes.
- The Toolbox's automated analysis, conversion, management, and control services simplify the conversion process, significantly reducing the risks usually involved in such a large project.
- Applies conversions consistently, minimizing human mistakes that are difficult to detect.
- Enables the correction of errors by applying changes across multiple applications.
- Develops control parameters to use as input to perform the actual conversion.
- Changes to a program do not affect the performance of the conversion.
- The customer retains control over source code, thereby minimizing freeze time.
The first step in scanning the environment is to create a dataset of recent log activity. Depending on the client's archiving procedures, up to 15 months of system log files may be processed. The purpose is to identify the active jobs, programs and on-line transactions, in order to reduce the conversion repository to active components only. In this step the DBLOG dataset is created from SMF and other monitor log files, such as Tmon, Omegamon, etc. The result is a list of jobs, programs and transactions, including statistics on how often each has been used. The jobs should be tailored to the site's archiving procedure and system-monitor type. Where a third-party log manager such as MXG is in use, a job should be tailored to its reports. The underlying assumption is that these reports have been verified by the client and found to be accurate.
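The activity-counting idea behind DBLOG can be illustrated with a small sketch. This is hypothetical Python, not the real DBLOG job or record layout; the record tuples merely stand in for entries extracted from SMF or monitor logs.

```python
from collections import Counter

# Hypothetical, simplified log records: (type, name) pairs as they might be
# extracted from SMF or monitor logs. The layout is illustrative only.
log_records = [
    ("JOB", "PAYROLL1"), ("PGM", "PAYCALC"), ("TRAN", "PY01"),
    ("JOB", "PAYROLL1"), ("PGM", "PAYCALC"), ("PGM", "TAXCALC"),
]

# Count how often each component appears in the scanned period.
usage = Counter(log_records)

# Components seen at least once are "active"; everything else can be
# dropped from the conversion repository.
active = {name for (_type, name) in usage}

for (rec_type, name), count in sorted(usage.items()):
    print(f"{rec_type:4} {name:10} used {count} time(s)")
```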
INIT – example: INIT(BUILDW(10000' ')) to initialize the work area with blanks.
TERM – calculate and write summary values of fields.
INITMEM – example: INITMEM(BUILD(1:$$MEMBER)) to save the member name.
TERMMEM – same as TERM, but the summaries are per member.
PROCESS / OUTREC – process a record from input. Give example on the board!!!
APPEND – write example of a report!!!
HEADER – create information for page-header lines.
TRAILER – same as above, for trailer lines.
KEYS – example: KEYS(1,8,HEADER('DETAILS FOR':1,8,/))
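The exit points above all follow one pattern: hooks invoked at fixed points in the scan. A minimal sketch of that pattern, rendered in hypothetical Python for illustration (the hook names mirror the C-Scan exits, but the code and data are invented):

```python
# Sketch of the exit-point pattern: INIT once at start, INITMEM/TERMMEM per
# member, PROCESS per record, TERM once at end. Names mirror the C-Scan
# exits; the implementation is purely illustrative.
def scan(members, exits):
    totals = exits["INIT"]()                          # INIT: global work area
    for name, records in members.items():
        per_member = exits["INITMEM"](name)           # INITMEM: per-member setup
        for rec in records:
            exits["PROCESS"](totals, per_member, rec) # PROCESS: each record
        exits["TERMMEM"](name, per_member)            # TERMMEM: member summary
    exits["TERM"](totals)                             # TERM: global summary

summary = {}
exits = {
    "INIT":    lambda: {"records": 0},
    "INITMEM": lambda name: {"records": 0},
    "PROCESS": lambda tot, mem, rec: (tot.update(records=tot["records"] + 1),
                                      mem.update(records=mem["records"] + 1)),
    "TERMMEM": lambda name, mem: summary.update({name: mem["records"]}),
    "TERM":    lambda tot: summary.update({"TOTAL": tot["records"]}),
}
scan({"MEMBER01": ["rec1", "rec2"], "MEMBER02": ["rec3"]}, exits)
```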
Examples should be prepared for those variables that need them.
Survey questionnaires are distributed to the client.
Site Preliminary Questionnaire
One of the first contacts with the client will be a request for general information about the client's environment, made through a Site Preliminary Questionnaire. The purpose of the questionnaire is to understand the complexity and quantities of the most important components of the client's environment. The questionnaire also identifies the main programming languages, the main DBMSs, system monitors, and naming conventions. Use the document to identify components not yet supported by the Toolbox.
System Questionnaires
Each site will have many applications, projects or systems being processed. A System Questionnaire should be distributed to each client resource responsible for an application, project or system. The purpose of the questionnaire is to collect information about every active application system or project that exists on site and is relevant to the conversion process. This is done by obtaining the naming conventions per system and a list of libraries, files, databases and I/O modules. The information is needed for setting up the tools that perform the survey and for clustering the systems. The information from the System Questionnaires will also be used to learn the client's environment; it will be compared with the actual components found by scanning the environment using IT Discovery. Reports are included with IT Discovery for reporting discrepancies in the client information.
This illustrates the process flow during Survey and Assessment using IT Discovery.
Properties of Objects Each object in AppBuilder has a set of properties or attributes that describe it. Some properties are common to all object types (for example, Name and System Id), but for the most part, each object type has a different set of properties. In other words, where a Field might have a data format and length, a Rule would not; it would have an execution environment. At this early stage of your AppBuilder learning, you are not expected to know all the properties of all the object types you will encounter; they will become apparent as you progress through the course. All you need to know at the moment is that when you create an object, the properties are set to default values, which you may well have to change. To see the properties for any object, press Alt + Enter from the hierarchy diagram.
The goal of this process is to create a relationship between a program and date fields. The process is primarily automated through IT Discovery jobs and includes the following steps:
Repository Files Merged – The language-oriented DBDTSCAN files are merged into a common global repository file.
Repository Files Populated – The final step of building the repository populates the repository file. The unnormalized DBDTSCAN dataset is exploded into many additional datasets such as DBCOPY, DBCALL, DBCALLED, etc. Duplicate information is deleted, and information about the same entity is merged from various records.
Repository Transactions Completed – This step completes the transaction information for the repository. The DBDTSCAN repository includes information about relationships between programs and transactions; this information is merged into DBTRAN. Up to this point the transaction repository covers only the relations between transactions and programs; the objective of this step is to complete the information regarding the relations between on-line programs and files (DDnames, DSnames or DBnames).
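The merge-and-deduplicate step can be sketched as follows. This is a hypothetical Python illustration, not the actual DBDTSCAN processing; the record layout and field names are invented for the example.

```python
# Hypothetical sketch of merging language-oriented scan files into one global
# repository: duplicate rows collapse onto one key, and information about the
# same entity (program + field) is merged from the various source records.
cobol_scan = [{"pgm": "PAYCALC", "field": "HIRE-DATE", "lang": "COBOL"}]
pli_scan   = [{"pgm": "PAYCALC", "field": "HIRE-DATE", "lang": "PL/I"},
              {"pgm": "TAXCALC", "field": "TAX-YEAR",  "lang": "PL/I"}]

merged = {}
for row in cobol_scan + pli_scan:
    key = (row["pgm"], row["field"])              # entity identity
    entry = merged.setdefault(key, {"pgm": row["pgm"],
                                    "field": row["field"],
                                    "langs": set()})
    entry["langs"].add(row["lang"])               # merge info about same entity

repository = list(merged.values())                # one row per entity
```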
What is an Object? All objects have the following five properties:
General Properties
Audit – who, when, where, etc.
Remote Audit – when created on the Enterprise repository.
Text – description of the object.
Keywords – help with searching for objects.
Some objects, such as Rules, also have source code associated with them.