According to the Oracle documentation:
You define a subject area by specifying a fact table or set of fact tables to be the central table or tables in the subject area. When a subject area is defined,
DAC performs the following process to determine the relevant tasks:
DAC identifies the dimension tables associated with the facts and adds these tables to the subject area.
DAC identifies the related tables, such as aggregates, associated with the fact or dimension tables and adds them to the subject area definition.
DAC identifies the tasks for which the dimension and fact tables listed in the two processes above are target tables and adds these tasks to the subject area.
Tasks that DAC automatically assigns to a subject area are indicated with the Autogenerated flag (in the Tasks subtab of the Subject Areas tab).
You can inactivate a task from participating in a subject area by selecting the Inactive check box (in the Tasks subtab of the Subject Areas tab). When the Inactive check box is selected, the task remains inactive even if you reassemble the subject area.
You can also remove a task from a subject area using the Add/Remove command in the Tasks subtab of the Subject Areas tab, but when you remove a task it is only removed from the subject area until you reassemble the subject area.
DAC identifies the source tables for the tasks identified in the previous process and adds these tables to the subject area.
DAC performs this process recursively until all necessary tasks have been added to the subject area. A task is added to the subject area only once, even if it is associated with several tables in the subject area. DAC then expands or trims the total number of tasks based on the configuration rules, which are defined as configuration tags. This process can be resource intensive because DAC loads all of the objects in the source system container into memory before parsing.
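The recursive assembly described above can be sketched roughly as follows. This is only an illustrative model, not DAC's actual implementation; the data structures (the related-tables map and the task dicts) are hypothetical simplifications.

```python
# Illustrative sketch of recursive subject-area assembly (not DAC source code).

def assemble_subject_area(fact_tables, related_tables, tasks):
    """fact_tables: central fact table names for the subject area.
    related_tables: table -> associated tables (dimensions, aggregates).
    tasks: list of {'name', 'targets', 'sources'} dicts."""
    area_tables = set(fact_tables)
    # Steps 1-2: add dimensions and related tables (aggregates, etc.)
    for table in list(area_tables):
        area_tables.update(related_tables.get(table, []))

    area_tasks = set()
    changed = True
    while changed:  # repeat until no new tasks or tables appear
        changed = False
        for task in tasks:
            # Step 3: a task joins the area if it targets an area table
            if task["name"] not in area_tasks and set(task["targets"]) & area_tables:
                area_tasks.add(task["name"])  # each task is added only once
                # Step 4: the task's source tables join the area too,
                # which may pull in further tasks on the next pass
                area_tables.update(task["sources"])
                changed = True
    return area_tables, area_tasks
```

On each pass, newly added source tables can make further tasks qualify, which mirrors the recursion the documentation describes.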
Yes: at the task level, set the execution type to SQL File. As a bonus to this answer, this article explains how to run stored procedures in DAC.
DAC offers better performance management, such as creating indexes, dropping indexes, and truncating tables before a load. Without DAC, a custom ETL process would be needed, and it would have to survive upgrades.
DAC export and import are primarily used to back up or migrate repository metadata. Logical, system, and runtime objects can all be exported and imported.
Upgrading or merging a DAC repository is carried out with the Upgrade/Merge Wizard. The Repository Upgrade option upgrades a DAC repository from an earlier DAC release. The Refresh Base option updates the DAC repository with metadata from a new version of Oracle BI Applications and produces a comparison report; the Simplified Refresh From Base option does the same without the comparison step. The Replace Base option is used when converting from an older source system container to a newer one. Finally, the Peer to Peer Merge option can align different DAC repository instances.
Refresh dates are tracked for tables that are either a primary source or a primary target of tasks in a completed run of an execution plan. DAC runs the full load command for tasks on which a table is a primary source or target if the refresh date for that table is null. When there are multiple primary sources, the earliest of the refresh dates triggers a full or incremental load; if any one of the primary source tables has no refresh date, DAC runs the full load command.
Once: you register the Informatica server in the DAC client.
Based on the tasks' source/target tables, task phase (extract dimension, load fact, etc.), and the 'Truncate Always' property, create a task group to run the tasks in a particular order.
The answer is yes. However, it works only when the execution plans are not loading into the same table.
It is vital to note that a subject area is defined by specifying a fact table, or a set of fact tables, to be the central table or tables in the subject area.
When a subject area is defined, DAC performs the following procedures to determine the tasks relevant to the data warehousing effort.
DAC identifies the dimension tables associated with the facts and adds these tables to the subject area.
It also locates the related tables, such as aggregates, associated with the fact or dimension tables, and adds them to the subject area definition.
DAC then identifies the tasks for which these dimension and fact tables are targets and adds those tasks to the subject area. Tasks that DAC assigns automatically are marked with the Autogenerated flag in the Tasks subtab of the Subject Areas tab.
You can prevent a task from participating in a subject area by selecting the Inactive check box in the Tasks subtab; an inactivated task remains inactive even if you reassemble the subject area.
You can also remove a task from a subject area using the Add/Remove command on the Tasks subtab of the Subject Areas tab. However, when you remove a task, it is removed from the subject area only until you reassemble the subject area.
DAC next identifies the source tables for the tasks identified in the previous steps and adds these tables to the subject area.
DAC performs this process recursively until all necessary tasks have been added. A task is added to the subject area only once, even if it is associated with several tables in the subject area. DAC then expands or trims the total number of tasks based on configuration rules, which are defined as configuration tags.
This process can be resource intensive, because DAC loads all of the objects in the source system container into memory before parsing.
Subject Areas -- A logical grouping of tables related to a particular subject or application context. It also includes the tasks that are associated with the tables,
as well as the tasks required to load the tables. Subject areas are assigned to execution plans, which can be scheduled for full or incremental loads
Tables -- Physical tables in the database
Indexes -- Just like your physical DB indexes
Tasks -- Unit of work for loading tables
Task groups -- A grouping of tasks that can be bundled to run as a group
Execution plans -- A data transformation plan defined on subject areas that needs to be transformed at certain frequencies of time
Schedules -- Determine how often execution plan runs.
Use Informatica scheduler.
The authentication file authenticates against the database in which the repository resides. If you opt to create a new authentication file, you specify the table owner and password for the database.
According to the Oracle documentation:
During an ETL execution, DAC reads and evaluates all parameters associated with that ETL run, including static and runtime parameters defined in DAC, parameters held in flat files, and parameters defined externally to DAC. DAC consolidates all the parameters for the ETL run, deduplicates any redundant parameters, and then creates an individual parameter file for each Informatica session. This file contains the evaluated name-value pairs for all parameters, both static and runtime, for each workflow that DAC executes. The parameter file contains a section for each session under a workflow. DAC determines the sessions under a workflow during runtime by using the Informatica pmrep function ListObjectDependencies.
The naming convention for the parameter file is
DAC writes this file to a location specified in the DAC system property InformaticaParameterFileLocation. The location specified by the property InformaticaParameterFileLocation must be the same as the location specified by the Informatica parameter property $PMSourceFileDir.
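As an illustration only -- the folder, workflow, session, and parameter names below are invented, and the exact layout can vary by release -- such a parameter file contains one section per session under the workflow, holding the evaluated name-value pairs:

```ini
[DAC_Folder.WF:wf_SDE_ORA_SalesFact.ST:s_SDE_ORA_SalesFact]
$$INITIAL_EXTRACT_DATE=01/01/2001
$$LAST_EXTRACT_DATE=06/15/2011 10:30:00
$DBConnection_OLTP=ORA_R12
```

Each session listed by the Informatica pmrep ListObjectDependencies call gets its own bracketed section like the one above.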
Micro ETL execution plans are ETL processes scheduled at frequent, fixed intervals, such as hourly or half-hourly. They usually handle small subject areas, or subsets of larger subject areas. The DAC server tracks refresh dates for the tables in micro ETL execution plans separately from other execution plans, and uses these refresh dates in the change capture process. To build and run one, create a copy of the subject area, deactivate the unwanted tasks, and create a new execution plan for this subject area.
According to the Oracle documentation:
When you configure a connection to the DAC Repository, the configuration process includes creating a new authentication file or selecting an existing authentication file. The authentication file authenticates the database in which the repository resides. If you create a new authentication file, you will specify the table owner and password for the database.
You can run two DAC servers on the same machine as long as they are listening on different ports and pointing to two different repositories.
Modify the refresh date to be 12/12/2011.
A way to import or export DAC repository metadata for upgrade or backup. Logical, system, and runtime objects can be imported and exported.
The DAC Server and the DAC Client. They must be co-located with the Informatica Integration Service, the Repository Service, and the Informatica repository.
Yes, but only if the execution plans are not loading into the same table or using the same physical source table.
According to the Oracle documentation:
This type of execution plan extracts data from multiple instances of the same source system. For example, a business might have an instance of Oracle EBS 11i in one location and time zone and another instance of Oracle EBS 11i in another location and time zone. In such cases, the timing of data extraction from the different instances can be staggered to meet your business requirements.
This type of execution plan extracts data from one or more instances of dissimilar source systems. For example, a business might have an instance of Siebel 7.8 in one location, an instance of Oracle EBS 11i in another location, and a second instance of Oracle EBS 11i in yet a third location. You can also stagger the timing of data extraction when you use this type of execution plan.
DAC comprises the DAC Client and the DAC Server. They must be co-located with the Informatica Integration Service and the Informatica repository.
In the DAC Client toolbar, click Email Recipients; then configure email under Tools --> DAC Server Setup --> Email Configuration.
Table actions override the default behavior for analyzing and truncating tables assigned to a particular database type. Task actions add new functionality to task behavior, comprising preceding action, success action, failure action, and upon-failure restart. Index actions override the default behavior for creating and dropping indexes.
There are multiple kinds of DAC repository objects, each essential to making sure the data warehousing tasks are accomplished properly.
The following are the various objects related to the DAC Repository:
A heterogeneous execution plan extracts data from one or more instances of dissimilar source systems. For instance, a business organization might have an instance of Siebel 7.8 in one location and an instance of Oracle EBS 11i in another. You can also stagger the timing of data extraction when using this type of execution plan.
A homogeneous execution plan, on the other hand, extracts data from multiple instances of the same source system. For example, a business might have an instance of Oracle EBS 11i in one location and time zone and another instance of Oracle EBS 11i in another location and time zone. In such cases, the timing of data extraction from the different instances can be staggered to meet the business requirements.
A SQL script can be executed by the DAC server at the task level, by selecting SQL File as the execution type.
Micro ETL processes can cause data inconsistencies and data availability issues, and can place additional load on the transactional database. You should be aware of the factors that can cause micro ETL execution plans to produce incorrect results.
For related star schemas, if one star is omitted from a micro ETL execution plan, cross-star reports can be inaccurate. For example, if the Person fact table is refreshed more frequently than the Revenue fact table, reports that join the two stars can produce inconsistent results.
If you omit dimension tables from a micro ETL execution plan, the foreign keys in the fact tables may point to unspecified rows. The foreign key references are resolved when a regular execution plan is run on the DAC server. Users of the data warehouse reports should be made aware of these inconsistencies.
If you do not include aggregate tables in the micro ETL execution plan, reports that use aggregate data will be inconsistent with reports that use base data until the aggregates are rebuilt by a regular ETL execution plan. Hierarchy tables are likewise rebuilt during every regular ETL execution plan; if you skip repopulating them during micro ETL runs, data inconsistencies can occur.
According to the Oracle documentation:
Micro ETL execution plans are ETL processes that you schedule at very frequent intervals, such as hourly or half-hourly. They usually handle small subject areas or subsets of larger subject areas. The DAC tracks refresh dates for tables in micro ETL execution plans separately from other execution plans and uses these refresh dates in the change capture process.
In Design --> Subject Areas, create a copy of the subject area, inactivate the unwanted tasks, and create a new execution plan for this subject area.
You have to understand what the DAC parameters are and the purpose of each. For example, Initial_extract_date can be modified when configuring for an initial full load; its value is used to filter out records from the source that are older than this date.
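To make that filtering concrete, here is a small sketch (the row data and the column name are invented for the example): rows whose last-update date precedes the initial extract date are excluded from the initial load.

```python
from datetime import datetime

# Hypothetical value of the Initial_extract_date DAC parameter.
INITIAL_EXTRACT_DATE = datetime(2001, 1, 1)

# Hypothetical source rows, each with a last-update timestamp.
source_rows = [
    {"id": 1, "last_update_date": datetime(1999, 6, 30)},
    {"id": 2, "last_update_date": datetime(2005, 3, 15)},
    {"id": 3, "last_update_date": datetime(2021, 11, 1)},
]

# Records older than the initial extract date are filtered out of the load.
extracted = [r for r in source_rows
             if r["last_update_date"] >= INITIAL_EXTRACT_DATE]
```

In practice the same cut-off is applied in the extract query's WHERE clause rather than in application code.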
According to the Oracle documentation:
Index action: Overrides the default behavior for dropping and creating indexes.
Table action: Overrides the default behavior for truncating and analyzing tables.
Task action: Adds new functionality to task behavior, such as a preceding action, success action, failure action, or upon-failure restart.
According to the Oracle documentation:
Refresh dates are tracked only for tables that are either a primary source or a primary target on tasks in a completed run of an execution plan. The DAC runs the full load command for tasks on which a table is a primary source or target if the refresh date against the table is null. When there are multiple primary sources, the earliest of the refresh dates will trigger a full load or an incremental load. If any one of the primary source tables has no refresh date, then the DAC will run the full load command.
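A minimal sketch of this refresh-date rule (an illustration of the logic only, not DAC's internal code):

```python
from datetime import date

def plan_load(primary_source_refresh_dates):
    """Decide the load type for a task from the refresh dates of its
    primary source tables; None means the table was never refreshed."""
    if any(d is None for d in primary_source_refresh_dates):
        return ("FULL", None)  # a missing refresh date forces a full load
    # Otherwise the earliest refresh date drives the incremental extract.
    return ("INCREMENTAL", min(primary_source_refresh_dates))
```

Using the earliest date guarantees that no source instance's changes are skipped when multiple primary sources feed one task.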
Just drop and recreate the index.