ETL Testing Multiple-Choice Questions (MCQs)

ETL tools extract data from different data sources, transform it, and load it into a data warehouse.

ETL Testing MCQs: This section contains multiple-choice questions and answers on the various topics of ETL Testing. Practice these MCQs to test and enhance your skills on ETL Testing.

List of ETL Testing MCQs

1. Using an ____ tool, data is extracted from multiple data sources, transformed, and loaded into a data warehouse after joining fields, calculating, and removing incorrect data fields.

  1. ETL
  2. TEL
  3. LET
  4. LTE

Answer: A) ETL

Explanation:

Using an ETL tool, data is extracted from multiple data sources, transformed, and loaded into a data warehouse after joining fields, calculating, and removing incorrect data fields.



2. After business ____, ETL testing ensures that the data has been loaded accurately from a source to a destination.

  1. Information
  2. Transformation
  3. Transfusion
  4. Transfiction

Answer: B) Transformation

Explanation:

After business transformation, ETL testing ensures that the data has been loaded accurately from a source to a destination.



3. During ETL, various stages of data are verified and used at ____.

  1. Source
  2. Destination
  3. Both A and B
  4. None of the above

Answer: C) Both A and B

Explanation:

During ETL, various stages of data are verified and used both at source and destination.



4. What is the full form of ETL?

  1. Extract Transformation and Load
  2. Extract Transformation and Lead
  3. Extract Transfusion and Load
  4. Extract Transfusion and Lead

Answer: A) Extract Transformation and Load

Explanation:

The full form of ETL is Extract Transformation and Load.



5. To fetch data from one database and place it in another, ETL combines all ____ database functions into one tool.

  1. Two
  2. Three
  3. Four
  4. Five

Answer: B) Three

Explanation:

To fetch data from one database and place it in another, ETL combines all three database functions into one tool.



6. Getting information from a database is called ____ (reading).

  1. Extracting
  2. Transforming
  3. Loading
  4. None

Answer: A) Extracting

Explanation:

Getting information from a database is called extracting (reading).



7. The process of ____ data involves converting it from one form to another.

  1. Extracting
  2. Transforming
  3. Loading
  4. None

Answer: B) Transforming

Explanation:

The process of transforming data involves converting it from one form to another.



8. In addition to using ____, the data can be combined with other data to undergo transformation.

  1. Rules
  2. Lookup tables
  3. Both A and B
  4. None of the above

Answer: C) Both A and B

Explanation:

In addition to using rules or lookup tables, the data can be combined with other data to undergo transformation.



9. Writing data into a database is called ____.

  1. Extracting
  2. Transforming
  3. Loading
  4. None

Answer: C) Loading

Explanation:

Writing data into a database is called loading.



10. Using ETL, you can ____ the data from multiple sources and blend them together according to your needs.

  1. Extract
  2. Transform
  3. Load
  4. All of the above

Answer: D) All of the above

Explanation:

Using ETL, you can extract, transform, and load the data from multiple sources and blend them together according to your needs.



11. ETL is often used to build a -

  1. Data Center
  2. Data Warehouse
  3. Data Set
  4. Data Care Center

Answer: B) Data Warehouse

Explanation:

ETL is often used to build a Data Warehouse.



12. As part of the ETL process, data from a source system is ____ into a format suitable for storing in a data warehouse or another system.

  1. Extracted
  2. Converted
  3. Both A and B
  4. None of the above

Answer: C) Both A and B

Explanation:

As part of the ETL process, data from a source system is extracted and converted into a format suitable for storing in a data warehouse or another system.



13. In today's environment, ETL is becoming more and more necessary for many reasons, including:

  1. In order to make critical business decisions, companies use ETL to analyze their business data.
  2. Data warehouses are repositories where data is shared.
  3. In ETL, data is moved from a variety of sources into a data warehouse.
  4. All of the above

Answer: D) All of the above

Explanation:

In today's environment, ETL is becoming more and more necessary for many reasons, including:

  1. In order to make critical business decisions, companies use ETL to analyze their business data.
  2. Data warehouses are repositories where data is shared.
  3. In ETL, data is moved from a variety of sources into a data warehouse.



14. What is TRUE about ETL?

  1. The ETL process allows the source and target systems to compare sample data.
  2. As part of the ETL process, complex transformations can be performed and additional storage space is required.
  3. Earlier, we defined ETL as a process of converting source data into target data and manipulating it.
  4. All of the above

Answer: D) All of the above

Explanation:

The things that are TRUE about ETL are -

  1. The ETL process allows the source and target systems to compare sample data.
  2. As part of the ETL process, complex transformations can be performed and additional storage space is required.
  3. Earlier, we defined ETL as a process of converting source data into target data and manipulating it.



15. A ____ is loaded with data through the ETL process.

  1. Data Mart
  2. Data Warehouse
  3. Both A and B
  4. None of the above

Answer: C) Both A and B

Explanation:

A data mart or data warehouse is loaded with data through the ETL process.



16. To facilitate business analysis, our data warehouse needs to be ____ regularly.

  1. Extracted
  2. Transformed
  3. Loaded
  4. None

Answer: C) Loaded

Explanation:

To facilitate business analysis, our data warehouse needs to be loaded regularly.



17. To avoid affecting the source system's performance, the ____ occurs on the ETL server or staging area.

  1. Loading
  2. Extraction
  3. Transformation
  4. None

Answer: C) Transformation

Explanation:

To avoid affecting the source system's performance, the transformation occurs on the ETL server or staging area.



18. Before data is moved to the warehouse, the extracted data can be validated in the ____ area.

  1. Staging
  2. Staggering
  3. Studying
  4. None

Answer: A) Staging

Explanation:

Before data is moved to the warehouse, the extracted data can be validated in the staging area.



19. How many methods are there to extract the data?

  1. Two
  2. Three
  3. Four
  4. Five

Answer: B) Three

Explanation:

There are three methods to extract the data.



20. Which of the following is/are the method(s) to extract the data?

  1. FULL Extraction
  2. Partial Extraction - Without Update Notification
  3. Partial Extraction - With Update Notification
  4. All of the above

Answer: D) All of the above

Explanation:

The following are the methods to extract the data -

  1. FULL Extraction
  2. Partial Extraction - Without Update Notification
  3. Partial Extraction - With Update Notification

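As an illustration, here is a minimal Python sketch of full extraction versus partial extraction without update notification (the source pushes no change events, so changes are detected via a timestamp column). SQLite from the standard library keeps it self-contained; the orders table, updated_at column, and last_run value are all hypothetical:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, updated_at TEXT);
        INSERT INTO orders VALUES (1, 10.0, '2024-01-01'), (2, 25.5, '2024-02-01');
    """)

    def full_extraction(conn):
        # FULL extraction: pull every row on every run.
        return conn.execute("SELECT * FROM orders").fetchall()

    def partial_extraction(conn, last_run):
        # Partial extraction without update notification: the source does not
        # push changes, so we detect them ourselves via the timestamp column.
        return conn.execute(
            "SELECT * FROM orders WHERE updated_at > ?", (last_run,)
        ).fetchall()

    print(full_extraction(conn))                   # both rows
    print(partial_extraction(conn, "2024-01-15"))  # only the changed row

With update notification, the source system itself would flag or publish the changed rows, so the WHERE clause above would not be needed.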


21. Whatever extraction method we use, the source system should not be affected in terms of ____ time.

  1. Performance
  2. Response
  3. Both A and B
  4. None of the above

Answer: C) Both A and B

Explanation:

Whatever extraction method we use, the source system should not be affected in terms of performance or response time.



22. Which of the following is/are the validation(s) using the extraction(s)?

  1. Check the source data against the record
  2. Ensure that the data type is correct
  3. There will be a check to see if all the keys are there
  4. All of the above

Answer: D) All of the above

Explanation:

The following are the validations using the extractions -

  1. Check the source data against the record
  2. Ensure that the data type is correct
  3. There will be a check to see if all the keys are there

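As a rough illustration, the following Python sketch applies the three checks above to rows already fetched from a hypothetical source; the expected count stands in for a control total taken from the source system:

    rows = [
        {"id": 1, "amount": 10.0},
        {"id": 2, "amount": 25.5},
    ]
    expected_count = 2  # control total taken from the source system

    # Check the source data against the record: counts must reconcile.
    assert len(rows) == expected_count, "record count mismatch against source"

    for row in rows:
        # Ensure that the data type is correct for every field.
        assert isinstance(row["id"], int) and isinstance(row["amount"], float)
        # Check that all the keys are there (present and non-null).
        assert row.get("id") is not None, "missing key"

    print("extraction validations passed")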


23. It is not possible to use the extracted data as originally formatted from the ____ server.

  1. Source
  2. Site
  3. Storage
  4. None

Answer: A) Source

Explanation:

It is not possible to use the extracted data as originally formatted from the source server.



24. ____ refers to data that does not need to be transformed.

  1. Direct Move
  2. Pass-through data
  3. Both A and B
  4. None of the above

Answer: C) Both A and B

Explanation:

Direct move or pass-through data refers to data that does not need to be transformed.



25. Which of the following is/are the validation point(s) during the transformation?

  1. Filtering
  2. Conversion of character sets and encodings
  3. Checking the threshold and validity of data
  4. All of the above

Answer: D) All of the above

Explanation:

The following are the validation points during the transformation -

  1. Filtering
  2. Conversion of character sets and encodings
  3. Checking the threshold and validity of data

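For illustration, here is a minimal Python sketch of the three validation points; the encoding choice and the 0 to 1,000,000 threshold are invented examples of such rules:

    def transform(record):
        # Filtering: drop records that fail a business rule.
        if record["amount"] < 0:
            return None
        # Character set conversion: normalize source bytes to UTF-8 text.
        name = record["name"].decode("latin-1")
        # Threshold and validity check: reject out-of-range values.
        if not (0 <= record["amount"] <= 1_000_000):
            raise ValueError("amount out of threshold: %r" % record["amount"])
        return {"name": name, "amount": record["amount"]}

    print(transform({"name": b"Jos\xe9", "amount": 42.0}))  # kept, decoded to 'José'
    print(transform({"name": b"Bad", "amount": -1.0}))      # filtered out -> None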


26. Which of the following is the last step of the ETL process?

  1. Extraction
  2. Transformation
  3. Loading
  4. None

Answer: C) Loading

Explanation:

The last step of the ETL process is loading.



27. Which of the following is/are the type(s) of loading?

  1. Initial Load
  2. Incremental Load
  3. Full Refresh
  4. All of the above

Answer: D) All of the above

Explanation:

The following are the types of loading -

  1. Initial Load
  2. Incremental Load
  3. Full Refresh



28. Loads need to be ____ according to server performance by the admin of the data warehouse.

  1. Monitored
  2. Resumed
  3. Canceled
  4. All of the above

Answer: D) All of the above

Explanation:

Loads need to be monitored, resumed, and canceled according to server performance by the admin of the data warehouse.



29. With a ____, all tables are erased and reloaded with new information.

  1. Initial Load
  2. Incremental Load
  3. Full Refresh
  4. None of the above

Answer: C) Full Refresh

Explanation:

With a Full Refresh, all tables are erased and reloaded with new information.

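As a concrete sketch of all three load types, assuming a hypothetical dim_customer target table in SQLite (the upsert syntax requires SQLite 3.24 or newer):

    import sqlite3

    tgt = sqlite3.connect(":memory:")
    tgt.execute("CREATE TABLE dim_customer (id INTEGER PRIMARY KEY, name TEXT)")

    def initial_load(rows):
        # Initial Load: populate an empty table for the first time.
        tgt.executemany("INSERT INTO dim_customer VALUES (?, ?)", rows)

    def incremental_load(rows):
        # Incremental Load: apply only new/changed rows, preserving the rest.
        tgt.executemany(
            "INSERT INTO dim_customer VALUES (?, ?) "
            "ON CONFLICT(id) DO UPDATE SET name = excluded.name", rows)

    def full_refresh(rows):
        # Full Refresh: erase the table and reload it with new information.
        tgt.execute("DELETE FROM dim_customer")
        tgt.executemany("INSERT INTO dim_customer VALUES (?, ?)", rows)

    initial_load([(1, "Ada"), (2, "Lin")])
    incremental_load([(2, "Linus"), (3, "Grace")])
    full_refresh([(1, "Ada")])
    print(tgt.execute("SELECT * FROM dim_customer").fetchall())  # [(1, 'Ada')]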


30. The term ETL is now extended to ____, or Extract, Monitor, Profile, Analyze, Cleanse, Transform, and Load.

  1. E-MPAC-TL
  2. E-PAC-TL
  3. E-MAP-TL
  4. E-MPAA-TL

Answer: A) E-MPAC-TL

Explanation:

The term is now extended to E-MPAC-TL or Extract, Monitor, Profile, Analyze, Cleanse, Transform, and Load.



31. During ____, the main goal is to capture data as quickly as possible from a system while minimizing the inconvenience to the system.

  1. Extraction
  2. Transformation
  3. Loading
  4. None

Answer: A) Extraction

Explanation:

During extraction, the main goal is to capture data as quickly as possible from a system while minimizing the inconvenience to the system.



32. ____ the data requires integrating the data and finally presenting the combined data to the end-user community via the front-end tools.

  1. Transforming
  2. Loading
  3. Both A and B
  4. None of the above

Answer: C) Both A and B

Explanation:

Transforming and loading the data requires integrating the data and finally presenting the combined data to the end-user community via the front-end tools.



33. Compared to traditional ETL, ETL ____ the time it takes for sources and targets to develop.

  1. Extends
  2. Reduces
  3. Exceeds
  4. Manipulates

Answer: B) Reduces

Explanation:

Compared to traditional ETL, ETL reduces the time it takes for sources and targets to develop.



34. ____ consists of a process between stages that is defined according to the needs, and it can verify the quality of the product.

  1. Quantity Assurance
  2. Quality Assurance
  3. Quantity Attribution
  4. Quality Attribution

Answer: B) Quality Assurance

Explanation:

Quality Assurance consists of a process between stages that is defined according to the needs, and it can verify the quality of the product.



35. In ____, analysis and validation of the data pattern and formats will be performed, as well as identification and validation of redundant data across data sources to determine the actual content, structure, and quality of the data.

  1. Data Profiling
  2. Data Analysis
  3. Source Analysis
  4. Cleansing

Answer: A) Data Profiling

Explanation:

In data profiling, analysis and validation of the data pattern and formats will be performed, as well as identification and validation of redundant data across data sources to determine the actual content, structure, and quality of the data.



36. It is important to focus on the surroundings of the sources in ____ analysis, so that the documentation of the sources can be obtained.

  1. Data
  2. Source
  3. Profile
  4. None

Answer: B) Source

Explanation:

It is important to focus on the surroundings of the sources in source analysis, so that the documentation of the sources can be obtained.



37. Based on metadata and a set of predefined rules, errors found can be fixed in ____.

  1. Data Analysis
  2. Source Analysis
  3. Cleansing
  4. Data Profiling

Answer: C) Cleansing

Explanation:

Based on metadata and a set of predefined rules, errors found can be fixed during cleansing.



38. An extended ETL concept, E-MPAC-TL is designed to meet the requirements while taking into account the realities of the systems, ____, constraints, and most importantly, the data itself.

  1. Tools
  2. Metadata
  3. Technical Issues
  4. All of the above

Answer: D) All of the above

Explanation:

An extended ETL concept, E-MPAC-TL is designed to meet the requirements while taking into account the realities of the systems, tools, metadata, technical issues, constraints, and most importantly, the data itself.



39. ETL testing is also known as -

  1. Table balancing
  2. Product Reconciliation
  3. Both A and B
  4. None of the above

Answer: C) Both A and B

Explanation:

ETL testing is also known as Table balancing or Product Reconciliation.



40. An ETL test ensures the data loaded after transformation is accurate after it has been ____ from a source to a destination.

  1. Added
  2. Loaded
  3. Deleted
  4. Altered

Answer: B) Loaded

Explanation:

An ETL test ensures the data loaded after transformation is accurate after it has been loaded from a source to a destination.



41. ETL testing is performed in ____ stages.

  1. Two
  2. Three
  3. Four
  4. Five

Answer: D) Five

Explanation:

ETL testing is performed in five stages.



42. Which of the following is/are the stage(s) of ETL testing?

  1. Data recovery
  2. Build and populate data
  3. Build reports
  4. All of the above

Answer: D) All of the above

Explanation:

The following are the stages of the ETL testing -

  1. Data recovery
  2. Build and populate data
  3. Build reports



43. Which of the following is/are the type(s) of ETL testing?

  1. New Data Warehouse Testing
  2. Production Validation Testing
  3. Application Upgrade
  4. All of the above

Answer: D) All of the above

Explanation:

The following are the types of ETL testing -

  1. New Data Warehouse Testing
  2. Production Validation Testing
  3. Application Upgrade



44. Customer requirements and different sources of data are taken into account in ____.

  1. New Data Warehouse Testing
  2. Production Validation Testing
  3. Application Upgrade
  4. Metadata Testing

Answer: A) New Data Warehouse Testing

Explanation:

Customer requirements and different sources of data are taken into account in New Data Warehouse Testing.



45. Which of the following group(s) is/are responsible for testing New Data Warehouses?

  1. Business Analyst
  2. Infrastructure People
  3. QA Testers
  4. All of the above

Answer: D) All of the above

Explanation:

The following groups are responsible for testing New Data Warehouses -

  1. Business Analyst
  2. Infrastructure People
  3. QA Testers



46. What is the responsibility of a Business Analyst?

  1. Requirements are gathered and documented by the business analyst.
  2. The test environment is set up by Business Analyst people.
  3. These plans and scripts are developed by Business Analysts and then executed by them.
  4. Each module is tested by a Business Analyst.

Answer: A) Requirements are gathered and documented by the business analyst.

Explanation:

Requirements are gathered and documented by the business analyst.



47. The ____ develops test plans and scripts and executes these plans and scripts.

  1. Infrastructure People
  2. QA Testers
  3. Developers
  4. Users

Answer: B) QA Testers

Explanation:

The QA tester develops test plans and scripts and executes these plans and scripts.



48. Performance and stress tests are conducted by ____.

  1. Infrastructure People
  2. Developers
  3. Users
  4. Database Administrators

Answer: D) Database Administrators

Explanation:

Performance and stress tests are conducted by Database Administrators.



49. What is the full form of UAT?

  1. User Analyst Testing
  2. User Acceptance Testing
  3. User Attribute Testing
  4. None

Answer: B) User Acceptance Testing

Explanation:

The full form of UAT is User Acceptance Testing.



50. Whenever data is moved into production systems, ____ tests are performed.

  1. Production Validation
  2. Source to Target
  3. Metadata
  4. Data Accuracy

Answer: A) Production Validation

Explanation:

Whenever data is moved into production systems, production validation tests are performed.



51. To ensure that the data doesn't compromise production systems, ____ automates ETL testing and management.

  1. Informatica Data Validation
  2. Irrelevant Data Validation
  3. Informatica Duration Validation
  4. Irrelevant Duration Validation

Answer: A) Informatica Data Validation

Explanation:

To ensure that the data doesn't compromise production systems, Informatica Data Validation automates ETL testing and management.



52. Validating the data values transformed to the expected data values is done through ____ testing.

  1. Source to target
  2. Metadata
  3. Data Accuracy
  4. Data Transformation

Answer: A) Source to target

Explanation:

Validating the data values transformed to the expected data values is done through source-to-target testing.

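A minimal sketch of such a test in Python with SQLite: compute the expected target values from the source in SQL and diff them against the actual target rows. The uppercasing rule and table names are invented examples:

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE src (id INTEGER, name TEXT);
        CREATE TABLE tgt (id INTEGER, name TEXT);
        INSERT INTO src VALUES (1, 'ada'), (2, 'lin');
        INSERT INTO tgt VALUES (1, 'ADA'), (2, 'Lin');   -- row 2 is wrong
    """)

    # Rows whose transformed source value does not appear in the target.
    mismatches = db.execute("""
        SELECT id, UPPER(name) FROM src
        EXCEPT
        SELECT id, name FROM tgt
    """).fetchall()

    print(mismatches)  # [(2, 'LIN')] -> row 2 failed the transformation rule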


53. Tests are automatically generated for ____, which saves test developers' time.

  1. Data Accuracy
  2. Data Transformation
  3. Application Upgrades
  4. Data Quality

Answer: C) Application Upgrades

Explanation:

Tests are automatically generated for Application Upgrades, which saves test developers' time.



54. When an application is upgraded, the extracted data from the old application is checked against the new application's data to ensure that they are ____.

  1. Different
  2. Identical
  3. Similar
  4. Varied

Answer: B) Identical

Explanation:

When an application is upgraded, the extracted data from the old application is checked against the new application's data to ensure that they are identical.



55. As part of ____ testing, types of data, lengths of data, and indexes and constraints are measured.

  1. Metadata
  2. Data Accuracy
  3. Data Transformation
  4. Data Quality

Answer: A) Metadata

Explanation:

As part of metadata testing, types of data, lengths of data, and indexes and constraints are measured.

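For illustration, a small Python sketch of a metadata test in SQLite, comparing declared column types against the expectation from the mapping document; a real warehouse test would typically query information_schema instead of PRAGMA:

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE tgt (id INTEGER PRIMARY KEY, name VARCHAR(50))")

    expected = {"id": "INTEGER", "name": "VARCHAR(50)"}  # from the mapping sheet

    # PRAGMA table_info returns (cid, name, type, notnull, default, pk).
    actual = {row[1]: row[2] for row in db.execute("PRAGMA table_info(tgt)")}
    assert actual == expected, "metadata drift: %r != %r" % (actual, expected)
    print("metadata test passed")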


56. We test the data ____ to ensure that data loading and transformation is accurate.

  1. Accuracy
  2. Transformation
  3. Quality
  4. None

Answer: A) Accuracy

Explanation:

We test the data accuracy to ensure that data loading and transformation is accurate.



57. Which of the following testing(s) is/are included in Data Quality Testing?

  1. Syntax
  2. Reference
  3. Both A and B
  4. None of the above

Answer: C) Both A and B

Explanation:

The following testings are included in Data Quality Testing -

  1. Syntax
  2. Reference



58. Invalid characters, invalid character patterns, or improper upper- or lower-case order will result in dirty data being reported by ____ tests.

  1. Syntax
  2. Reference
  3. Accuracy
  4. Transformation

Answer: A) Syntax

Explanation:

Invalid characters, invalid character patterns, or improper upper- or lower-case order will result in dirty data being reported by syntax tests.

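A minimal Python sketch of a syntax test; the pattern below is an invented example of such a rule (capitalized word, letters only):

    import re

    VALID_NAME = re.compile(r"^[A-Z][a-z]+$")  # e.g. "Alice", not "aLICE" or "Al!ce"

    rows = ["Alice", "aLICE", "Al!ce", "Bob"]
    dirty = [name for name in rows if not VALID_NAME.match(name)]
    print(dirty)  # ['aLICE', 'Al!ce'] -> reported as dirty data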


59. A data integrity test is conducted for ____ testing when new data is added to old data.

  1. Incremental ETL
  2. GUI/Navigation
  3. Migration
  4. Report

Answer: A) Incremental ETL

Explanation:

A data integrity test is conducted for incremental ETL testing when new data is added to old data.



60. After data has been inserted and updated during an incremental ETL process, incremental testing verifies the system is ____ properly.

  1. Deadlocked
  2. Still Working
  3. Crashed
  4. Initiated

Answer: B) Still Working

Explanation:

After data has been inserted and updated during an incremental ETL process, incremental testing verifies the system is still working properly.



61. ____ reports are tested for navigation and GUI aspects by GUI/Navigation Testing.

  1. Front-end
  2. Back-end
  3. Both A and B
  4. None of the above

Answer: A) Front-end

Explanation:

Front-end reports are tested for navigation and GUI aspects by GUI/Navigation Testing.



62. An existing data warehouse is used in ____ Testing, and ETL is used to process the data.

  1. Migration
  2. Report
  3. Incremental ETL
  4. GUI

Answer: A) Migration

Explanation:

An existing data warehouse is used in Migration Testing, and ETL is used to process the data.



63. Which of the following steps is/are included in Migration Testing?

  1. Design and validation tests
  2. Setting up the test environment
  3. Executing the validation test
  4. All of the above

Answer: D) All of the above

Explanation:

The following steps are included in Migration Testing -

  1. Design and validation tests
  2. Setting up the test environment
  3. Executing the validation test



64. ____ validation should be done for reports.

  1. Data
  2. Layout
  3. Both A and B
  4. None of the above

Answer: C) Both A and B

Explanation:

Data validation and layout validation should be done for reports.



65. Which of the following task(s) is/are performed in ETL testing?

  1. The ability to understand and report on data
  2. Source-to-target mapping
  3. Analyzes the source data for errors
  4. All of the above

Answer: D) All of the above

Explanation:

The following tasks are performed in ETL testing -

  1. The ability to understand and report on data
  2. Source-to-target mapping
  3. Analyzes the source data for errors



66. ____ testing is typically performed on transactional systems, whereas ____ testing is typically performed on data in a data warehouse.

  1. Database, ETL
  2. ETL, Database
  3. Database, ELT
  4. ELT, Database

Answer: A) Database, ETL

Explanation:

Database testing is typically performed on transactional systems, whereas ETL testing is typically performed on data in a data warehouse.



67. The following operations are involved in ETL testing:

  1. Data movement validation between the source and target systems.
  2. The source and target systems should be verified for data counts.
  3. During ETL testing, the transformations and extractions are verified according to requirements.
  4. All of the above

Answer: D) All of the above

Explanation:

The following operations are involved in ETL testing:

  1. Data movement validation between the source and target systems.
  2. The source and target systems should be verified for data counts.
  3. During ETL testing, the transformations and extractions are verified according to requirements.



68. During ETL testing, ____ are verified to ensure that they are preserved.

  1. Joins
  2. Keys
  3. Both A and B
  4. None of the above

Answer: C) Both A and B

Explanation:

During ETL testing, joins and keys are verified to ensure that they are preserved.



69. In database testing, the focus is on ensuring that data is ____.

  1. Accurate
  2. Correct
  3. Valid
  4. All of the above

Answer: D) All of the above

Explanation:

In database testing, the focus is on ensuring that data is accurate, correct, and valid.



70. The following operations are performed during database testing:

  1. During database testing, data values in columns are verified to ensure they are valid.
  2. Tests are conducted on databases to determine whether primary or foreign keys are maintained.
  3. Testing the database verifies if the column has any missing data.
  4. All of the above

Answer: D) All of the above

Explanation:

The following operations are performed during database testing:

  1. During database testing, data values in columns are verified to ensure they are valid.
  2. Tests are conducted on databases to determine whether primary or foreign keys are maintained.
  3. Testing the database verifies if the column has any missing data.



71. A performance test is conducted to determine if ETL systems can handle ____ at the same time.

  1. Multiple Users
  2. Transactions
  3. Both A and B
  4. None of the above

Answer: C) Both A and B

Explanation:

A performance test is conducted to determine if ETL systems can handle multiple users and transactions at the same time.

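As a rough sketch of such a test in Python, several simulated users insert transactions concurrently and the test checks that none are lost; a real performance test would also measure latency and throughput:

    import sqlite3, threading

    db = sqlite3.connect(":memory:", check_same_thread=False)
    db.execute("CREATE TABLE audit (user_id INTEGER, txn INTEGER)")
    lock = threading.Lock()

    def user_session(user_id, txns=100):
        for i in range(txns):
            with lock:  # serialize access to the shared connection
                db.execute("INSERT INTO audit VALUES (?, ?)", (user_id, i))

    threads = [threading.Thread(target=user_session, args=(u,)) for u in range(8)]
    for t in threads: t.start()
    for t in threads: t.join()

    count = db.execute("SELECT COUNT(*) FROM audit").fetchone()[0]
    assert count == 8 * 100, "lost transactions under concurrency"
    print("handled", count, "transactions from 8 concurrent users")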


72. A ____ compares the data between a source system and a target system without transforming the data in either system.

  1. Value Comparison
  2. Value Compression
  3. Value Compromise
  4. Value Contraction

Answer: A) Value Comparison

Explanation:

A value comparison compares the data between a source system and a target system without transforming the data in either system.



73. The data accuracy of both the source and the target can be checked with a set of ____ operators.

  1. Relational
  2. Rational
  3. SQL
  4. Database

Answer: C) SQL

Explanation:

The data accuracy of both the source and the target can be checked with a set of SQL operators.



74. ____ the distinct values of critical data columns in the source and target systems is a good way to verify the integrity of critical data columns.

  1. Examining
  2. Comparing
  3. Differentiating
  4. None of the above

Answer: B) Comparing

Explanation:

Comparing the distinct values of critical data columns in the source and target systems is a good way to verify the integrity of critical data columns.

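For illustration, a minimal Python sketch of the distinct-value check on a hypothetical critical column:

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE src (country TEXT);
        CREATE TABLE tgt (country TEXT);
        INSERT INTO src VALUES ('IN'), ('US'), ('US'), ('DE');
        INSERT INTO tgt VALUES ('IN'), ('US');               -- 'DE' went missing
    """)

    src_vals = {r[0] for r in db.execute("SELECT DISTINCT country FROM src")}
    tgt_vals = {r[0] for r in db.execute("SELECT DISTINCT country FROM tgt")}
    print("missing in target:", src_vals - tgt_vals)  # {'DE'}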


75. A single SQL query cannot convert data because ____.

  1. It can compare the output with the target.
  2. It can't compare the output with the target.
  3. It can't compare the input with the target.
  4. It can compare the input with the target.

Answer: B) It can't compare the output with the target.

Explanation:

A single SQL query cannot convert data because it can't compare the output with the target.



76. Data Transformation ETL testing involves writing ____ SQL queries for each row in order to confirm the rules of the transformation.

  1. Two
  2. Three
  3. Four
  4. Multiple

Answer: D) Multiple

Explanation:

Data Transformation ETL testing involves writing multiple SQL queries for each row in order to confirm the rules of the transformation.

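A minimal Python sketch of that idea: since one query cannot both transform and compare, each transformation rule gets its own check, and every target row is run through all of them. The two rules shown (trimmed name, non-negative amount) are invented examples:

    target_rows = [
        {"id": 1, "name": "Ada ", "amount": 10.0},
        {"id": 2, "name": "Lin", "amount": -5.0},
    ]

    rules = {
        "name is trimmed":        lambda r: r["name"] == r["name"].strip(),
        "amount is non-negative": lambda r: r["amount"] >= 0,
    }

    for row in target_rows:
        for rule, check in rules.items():   # several checks per row
            if not check(row):
                print("row %s violates rule: %s" % (row["id"], rule))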


77. It is imperative that we select sufficient and representative data from the source system to perform the successful ____ testing.

  1. Database
  2. ETL
  3. Both A and B
  4. None of the above

Answer: B) ETL

Explanation:

It is imperative that we select sufficient and representative data from the source system to perform successful ETL testing.



78. The document(s) that the ETL tester always uses during the testing process is/are:

  1. ETL Mapping Sheets
  2. DB Schema of Source (Target)
  3. Both A and B
  4. None of the above

Answer: C) Both A and B

Explanation:

The document(s) that the ETL tester always uses during the testing process is/are:

  1. ETL Mapping Sheets
  2. DB Schema of Source (Target)



79. A ____ contains all the columns and their lookups in reference tables for both source and destination tables.

  1. Mapping Sheet
  2. DB Schema of Source
  3. DB Schema of Target
  4. None of the above

Answer: A) Mapping Sheet

Explanation:

A mapping sheet contains all the columns and their lookups in reference tables for both source and destination tables.

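As a rough sketch, a mapping sheet can be represented as a list of entries and used to generate one comparison query per mapped column; all table and column names below are invented examples:

    mapping_sheet = [
        {"src_table": "src_orders", "src_col": "cust_id",
         "tgt_table": "fact_orders", "tgt_col": "customer_id",
         "lookup": "dim_customer"},
        {"src_table": "src_orders", "src_col": "amt",
         "tgt_table": "fact_orders", "tgt_col": "amount", "lookup": None},
    ]

    # Generate one source-vs-target comparison query per mapping entry.
    for m in mapping_sheet:
        print("SELECT %s FROM %s EXCEPT SELECT %s FROM %s;"
              % (m["src_col"], m["src_table"], m["tgt_col"], m["tgt_table"]))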


80. Which of the following is/are the type(s) of ETL Bugs?

  1. Calculation
  2. User Interface
  3. Load Condition
  4. All of the above

Answer: D) All of the above

Explanation:

The following are the types of ETL Bugs -

  1. Calculation
  2. User Interface
  3. Load Condition



81. ____, spelling check, and other issues related to the Graphical User Interface of an application are examples of User Interface bugs.

  1. Color
  2. Font Style
  3. Navigation
  4. All of the above

Answer: D) All of the above

Explanation:

Color, font style, navigation, spelling check, and other issues related to the Graphical User Interface of an application are examples of User Interface bugs.



82. As a result of the ____ bug, invalid values are being taken by the application and valid values are being rejected.

  1. Input-output
  2. Boundary value analysis
  3. Calculation
  4. Load Condition

Answer: A) Input-output

Explanation:

As a result of the input-output bug, invalid values are being taken by the application and valid values are being rejected.



83. Bugs that check for minimums and maximums are called ____ bugs.

  1. Calculation
  2. Load Condition
  3. Boundary value analysis
  4. Race Condition

Answer: C) Boundary value analysis

Explanation:

Bugs that check for minimums and maximums are called boundary value analysis bugs.



84. Mathematical errors show up in ____ bugs, and the results are usually inaccurate.

  1. Load Condition
  2. Race Condition
  3. Hardware
  4. Calculation

Answer: D) Calculation

Explanation:

Mathematical errors show up in calculation bugs, and the results are usually inaccurate.



85. Valid or invalid types are produced by ____ bugs.

  1. Load Condition
  2. Race Condition
  3. Equivalence Class Partitioning
  4. Version Control

Answer: C) Equivalence Class Partitioning

Explanation:

Valid or invalid types are produced by Equivalence Class Partitioning bugs.



86. Regression Testing bugs do not indicate what version they came from, as they are usually caused by ____ bugs.

  1. Help Source
  2. Hardware
  3. Version Control
  4. Load Condition

Answer: C) Version Control

Explanation:

Regression Testing bugs do not indicate what version they came from, as they are usually caused by Version Control bugs.



87. It is the ETL tester's responsibility to validate the ____ and extract the data from the target table.

  1. Data Sources
  2. Apply Transformation
  3. Load the data into the target table
  4. All of the above

Answer: D) All of the above

Explanation:

It is the ETL tester's responsibility to validate the data sources, apply transformation logic, load the data into the target table, and extract the data from the target table.



88. ETL testers has/have the following responsibility/ies:

  1. In the source system, verify the table.
  2. Apply Transformation Logic
  3. Data Loading
  4. All of the above

Answer: D) All of the above

Explanation:

ETL testers have the following responsibilities:

  1. In the source system, verify the table.
  2. Apply Transformation Logic
  3. Data Loading



89. Following is/are the type(s) of operations involved in verifying the table in the source system:

  1. Count Check
  2. Data Type Check
  3. Remove Duplicate Data
  4. All of the above

Answer: D) All of the above

Explanation:

Following are the types of operations involved in verifying the table in the source system:

  1. Count Check
  2. Data Type Check
  3. Remove Duplicate Data



90. What is/are the advantage(s) of ETL Testing?

  1. During ETL testing, data can be extracted from or received from any data source simultaneously.
  2. In ETL, heterogeneous data sources can be loaded into a single generalized (frequent)/different target simultaneously.
  3. It is possible to load different types of targets simultaneously using ETL.
  4. All of the above

Answer: D) All of the above

Explanation:

The advantages of ETL Testing are -

  1. During ETL testing, data can be extracted from or received from any data source simultaneously.
  2. In ETL, heterogeneous data sources can be loaded into a single generalized (frequent)/different target simultaneously.
  3. It is possible to load different types of targets simultaneously using ETL.



91. An ____ procedure is capable of extracting business data from a variety of sources and loading it into a different target as desired.

  1. ETL
  2. Database
  3. Both A and B
  4. None of the above

Answer: A) ETL

Explanation:

An ETL procedure is capable of extracting business data from a variety of sources and loading it into a different target as desired.



92. What is/are the disadvantage(s) of ETL Testing?

  1. ETL testing has the disadvantage of requiring us to be database analysts or developers with data-oriented experience.
  2. On-demand or real-time access is not ideal when we need a fast response.
  3. There will be a delay of months before any ETL testing can be done.
  4. All of the above

Answer: D) All of the above

Explanation:

The disadvantages of ETL Testing are -

  1. ETL testing has the disadvantage of requiring us to be database analysts or developers with data-oriented experience.
  2. On-demand or real-time access is not ideal when we need a fast response.
  3. There will be a delay of months before any ETL testing can be done.



93. What is/are the requisites provided by ETL tools?

  1. Multiple data structures and different platforms, such as mainframes, servers, and databases, can be collected, read, and migrated using ETL tools.
  2. Using an ETL tool is as easy as sorting, filtering, reformatting, merging, and joining data.
  3. A few ETL tools support BI tools and functionality such as transformation scheduling, monitoring, and version control.
  4. All of the above

Answer: D) All of the above

Explanation:

The requisites provided by ETL tools are -

  1. Multiple data structures and different platforms, such as mainframes, servers, and databases, can be collected, read, and migrated using ETL tools.
  2. Using an ETL tool is as easy as sorting, filtering, reformatting, merging, and joining data.
  3. A few ETL tools support BI tools and functionality such as transformation scheduling, monitoring, and version control.



94. What is/are the benefit(s) of ETL tools?

  1. Ease of Use
  2. Operational Resilience
  3. Visual Flow
  4. All of the above

Answer: D) All of the above

Explanation:

The benefits of ETL tools are -

  1. Ease of Use
  2. Operational Resilience
  3. Visual Flow



95. Data engineers can develop a successful and well-instrumented system with ETL tools that have ____ error handling.

  1. Artificial
  2. Built-in
  3. Natural
  4. None

Answer: B) Built-in

Explanation:

Data engineers can develop a successful and well-instrumented system with ETL tools that have built-in error handling.



96. An ETL tool simplifies the task of ____ and integrating multiple data sets when dealing with complex rules and transformations.

  1. Calculating
  2. String Manipulation
  3. Changing Data
  4. All of the above

Answer: D) All of the above

Explanation:

An ETL tool simplifies the task of calculating, manipulating strings, changing data, and integrating multiple data sets when dealing with complex rules and transformations.



97. ____ can be simplified using ETL tools.

  1. Extraction
  2. Transformation
  3. Loading
  4. All of the above

Answer: D) All of the above

Explanation:

Extraction, transformation, and loading can be simplified using ETL tools.



98. Which of the following is/are the type(s) of ETL tools?

  1. Talend Data Integration
  2. Informatica
  3. Kettle
  4. All of the above

Answer: D) All of the above

Explanation:

The following are the types of ETL tools -

  1. Talend Data Integration
  2. Informatica
  3. Kettle



99. Which of the following is a cloud-based tool?

  1. Clover ETL
  2. AWS Glue
  3. Jasper ETL
  4. Kettle

Answer: B) AWS Glue

Explanation:

AWS Glue is a cloud-based tool.



100. The ETL tool function is a ____-layered structure.

  1. One
  2. Two
  3. Three
  4. Four

Answer: C) Three

Explanation:

The ETL tool function is a three-layered structure.



101. Which of the following is/are the layer(s) in the ETL tool function?

  1. Staging Layer
  2. Data Integration Layer
  3. Access Layer
  4. All of the above

Answer: D) All of the above

Explanation:

The following are the layers in the ETL tool function -

  1. Staging Layer
  2. Data Integration Layer
  3. Access Layer



102. Data extracted from different sources is stored in a ____.

  1. Staging database
  2. Staging layer
  3. Both A and B
  4. None of the above

Answer: C) Both A and B

Explanation:

Data extracted from different sources is stored in a staging database or staging layer.



103. A database is created from the data transformed by the ____ Layer.

  1. Data
  2. Staging
  3. Integration
  4. Access

Answer: C) Integration

Explanation:

A database is created from the data transformed by the Integration Layer.



104. Facts and aggregate facts are grouped into hierarchical groups in the database referred to as ____.

  1. Dimensions
  2. Data
  3. Dataset
  4. Deadlock

Answer: A) Dimensions

Explanation:

Facts and aggregate facts are grouped into hierarchical groups in the database referred to as dimensions.



105. Which of the following is TRUE about RightData?

  1. An online tool for testing ETL/Data integration, RightData is available as a self-service program.
  2. Data can be validated and coordinated between datasets despite differences in data models or types of sources with RightData's interface.
  3. RightData is designed to work efficiently on data platforms with high complexity and large volumes.
  4. All of the above

Answer: D) All of the above

Explanation:

The following are TRUE about RightData -

  1. An online tool for testing ETL/Data integration, RightData is available as a self-service program.
  2. Data can be validated and coordinated between datasets despite differences in data models or types of sources with RightData's interface.
  3. RightData is designed to work efficiently on data platforms with high complexity and large volumes.



106. ____ testing can be done with the QuerySurge tool.

  1. Data Warehouse
  2. Big Data
  3. Both A and B
  4. None of the above

Answer: C) Both A and B

Explanation:

Data Warehouse and Big Data testing can be done with the QuerySurge tool.



107. Which of the following is/are the feature(s) of QuerySurge?

  1. A big data testing and ETL testing tool, QuerySurge automates the testing process.
  2. It automates the manual process and schedules tests for a specific date and time.
  3. Using this tool, you can create test scenarios and test suites along with configurable reports without knowing SQL.
  4. All of the above

Answer: D) All of the above

Explanation:

The following are the features of QuerySurge -

  1. A big data testing and ETL testing tool, QuerySurge automates the testing process.
  2. It automates the manual process and schedules tests for a specific date and time.
  3. Using this tool, you can create test scenarios and test suites along with configurable reports without knowing SQL.



108. Data-centric projects, such as ____, etc., require automated ETL testing tools such as iCEDQ.

  1. Warehouses
  2. Data migrations
  3. Both A and B
  4. None of the above

Answer: C) Both A and B

Explanation:

Data-centric projects, such as warehouses, data migrations, etc., require automated ETL testing tools such as iCEDQ.



109. Sources and systems are ____ by iCEDQ.

  1. Verified
  2. Validated
  3. Coordinated
  4. All of the above

Answer: D) All of the above

Explanation:

Sources and systems are verified, validated, and coordinated by iCEDQ.



110. What is/are the feature(s) of iCEDQ?

  1. We use iCEDQ to compare millions of files and rows of data when we do ETL testing.
  2. As a result, it is possible to identify exactly which columns and rows contain data errors.
  3. iCEDQ compares the data in memory based on the unique columns in the database.
  4. All of the above

Answer: D) All of the above

Explanation:

The features of iCEDQ are -

  1. We use iCEDQ to compare millions of files and rows of data when we do ETL testing.
  2. As a result, it is possible to identify exactly which columns and rows contain data errors.
  3. iCEDQ compares the data in memory based on the unique columns in the database.



111. ETL and end-to-end testing are offered by QualiDI's ____ testing platform.

  1. Non-automated
  2. Semi-automated
  3. Automated
  4. None of the above

Answer: C) Automated

Explanation:

ETL and end-to-end testing are offered by QualiDI's automated testing platform.



112. What is/are the feature(s) of QualiDI?

  1. Test cases can be created automatically with QualiDI, and the automated data can be compared with the manual data.
  2. Continuous integration is supported.
  3. QualiDI manages complex BI testing cycles, eliminating human error and managing data quality.
  4. All of the above

Answer: D) All of the above

Explanation:

The features of QualiDI are -

  1. Test cases can be created automatically with QualiDI, and the automated data can be compared with the manual data.
  2. Continuous integration is supported.
  3. QualiDI manages complex BI testing cycles, eliminating human error and managing data quality.



113. A data migration pipeline involves ____ data from an input source, transforming it, and loading it into an output destination for analysis, reporting, and synchronization (such as a datamart, database, and data warehouse).

  1. Adding
  2. Deleting
  3. Extracting
  4. Modifying

Answer: C) Extracting

Explanation:

A data migration pipeline involves extracting data from an input source, transforming it, and loading it into an output destination for analysis, reporting, and synchronization (such as a datamart, database, and data warehouse).



114. What is/are TRUE about ETL Pipelines?

  1. In addition to enterprise data warehouses, subject-specific data marts are also built using ETL pipelines.
  2. As a data migration solution, ETL pipelines are also used when replacing traditional applications with new ones.
  3. Industry-standard ETL tools are usually used to construct ETL pipelines that transform structured data.
  4. All of the above

Answer: D) All of the above

Explanation:

The things TRUE about ETL Pipelines are -

  1. In addition to enterprise data warehouses, subject-specific data marts are also built using ETL pipelines.
  2. As a data migration solution, ETL pipelines are also used when replacing traditional applications with new ones.
  3. Industry-standard ETL tools are usually used to construct ETL pipelines that transform structured data.



115. Using data pipelines, one can ____, create real-time data streaming applications, conduct data mining, and build data-driven digital products.

  1. Integrate data across applications
  2. Build data-driven web products
  3. Build predictive models
  4. All of the above

Answer: D) All of the above

Explanation:

Using data pipelines, one can integrate data across applications, build data-driven web products, build predictive models, create real-time data streaming applications, conduct data mining, and build data-driven digital products.



116. Logs of ETL contain information about disk access, ____, and the Microsoft Operating System's performance. They also record high-frequency events.

  1. Page Initials
  2. Page faults
  3. Pagination
  4. Page rows

Answer: B) Page faults

Explanation:

Logs of ETL contain information about disk access, page faults, and the Microsoft Operating System's performance. They also record high-frequency events.



117. ____ files are also used by the Eclipse Open Development Platform.

  1. .psd
  2. .etl
  3. .pdf
  4. .png

Answer: B) .etl

Explanation:

.etl files are also used by the Eclipse Open Development Platform.



118. What is/are TRUE about Trace Logs?

  1. By default, trace providers generate trace logs in their trace session buffers, which are stored by the operating system.
  2. A compressed binary format is then used to store trace logs in a log.
  3. Both A and B
  4. None of the above

Answer: C) Both A and B

Explanation:

The things TRUE about Trace Logs are -

  1. By default, trace providers generate trace logs in their trace session buffers, which are stored by the operating system.
  2. A compressed binary format is then used to store trace logs in a log.



119. A product with an ETL ____ Mark has been independently tested to meet the applicable standard.

  1. Data
  2. Listed
  3. Stamp
  4. None

Answer: B) Listed

Explanation:

A product with an ETL Listed Mark has been independently tested to meet the applicable standard.



120. In order to maintain the certification for products with ETL listed marks, regular product and site inspections are conducted to ensure that the product is manufactured and matches the ____ product.

  1. Corrupted
  2. Original
  3. Copy
  4. Artificial

Answer: B) Original

Explanation:

In order to maintain the certification for products with ETL-listed marks, regular product and site inspections are conducted to ensure that the product is manufactured and matches the original product.



121. Which of the following operation(s) is/are performed in Database Testing?

  1. Validating the values of columns in a table is the focus of database testing.
  2. Database testing is used to ensure the foreign key or primary key is maintained.
  3. During database testing, it is verified whether there is any missing data in a column.
  4. All of the above

Answer: D) All of the above

Explanation:

Operations that are performed in Database Testing -

  1. Validating the values of columns in a table is the focus of database testing.
  2. Database testing is used to ensure the foreign key or primary key is maintained.
  3. During database testing, it is verified whether there is any missing data in a column.



122. Which of the following tasks is NOT involved in ETL Transformation Process?

  1. Filtering
  2. Cleaning
  3. Joining
  4. Addressing

Answer: D) Addressing

Explanation:

Addressing is not a task involved in the ETL transformation process.



123. The following point(s) is/are involved in building streaming ETL based on Kafka:

  1. Extracting data into Kafka
  2. Pulling data from Kafka
  3. Load data to other systems
  4. All of the above

Answer: D) All of the above

Explanation:

The following points are involved in building streaming ETL based on Kafka:

  1. Extracting data into Kafka
  2. Pulling data from Kafka
  3. Load data to other systems

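For illustration, here is a minimal Python sketch of those three steps using the third-party kafka-python package (assumed to be installed); the broker address and topic names are hypothetical:

    from kafka import KafkaConsumer, KafkaProducer
    import json

    producer = KafkaProducer(bootstrap_servers="localhost:9092",
                             value_serializer=lambda v: json.dumps(v).encode())

    # 1. Extract data into Kafka: source rows become events on a topic.
    producer.send("orders_raw", {"id": 1, "amount": 10.0})
    producer.flush()

    # 2. Pull data from Kafka, transforming each event as it arrives.
    consumer = KafkaConsumer("orders_raw",
                             bootstrap_servers="localhost:9092",
                             value_deserializer=lambda b: json.loads(b.decode()),
                             auto_offset_reset="earliest")
    for msg in consumer:
        row = msg.value
        row["amount_cents"] = int(row["amount"] * 100)  # transformation step
        # 3. Load to other systems: here, re-publish to a cleaned topic.
        producer.send("orders_clean", row)

The consumer loop runs indefinitely, as streaming pipelines normally do; in a real pipeline the final step would more likely write to a warehouse than to another topic.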


124. The following operation(s) is/are performed in applying transformation logic -

  1. The record count is checked before and after the transformation logic is applied.
  2. The intermediate table must be validated as the data flows from the staging area.
  3. Make sure the thresholds for the data are valid.
  4. All of the above

Answer: D) All of the above

Explanation:

The following operations are performed when applying transformation logic -

  1. The record count is checked before and after the transformation logic is applied.
  2. The intermediate table must be validated as the data flows from the staging area.
  3. Make sure the thresholds for the data are valid.



125. ____ the data is loaded, transformation logic is applied.

  1. Before
  2. After
  3. While
  4. None

Answer: A) Before

Explanation:

Before the data is loaded, transformation logic is applied.





