CS614-Midterm
1 / 50
Execution can complete successfully, or it may be stopped due to some error. In case of successful completion of execution, all the transactions will be ___________
2 / 50
_____________ is a process that involves gathering information about a column through the execution of certain queries, with the intention of identifying erroneous records.
3 / 50
A single database couldn't serve both operational high-performance transaction processing and DSS analytical processing at the same time.
4 / 50
The goal of ideal parallel execution is to completely parallelize those parts of a computation that are not constrained by data dependencies. The __________ the portion of the program that must be executed sequentially, the greater the scalability of the computation.
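Note: the relationship this question points at is Amdahl's law. The sketch below is a minimal illustration only; the processor count and the sequential fractions are made-up example values, not part of the quiz. It shows that a smaller sequential fraction yields a greater speedup.

```python
# Amdahl's law: speedup = 1 / (s + (1 - s) / N), where s is the fraction of the
# program that must run sequentially and N is the number of processors.
def speedup(sequential_fraction: float, processors: int) -> float:
    return 1.0 / (sequential_fraction + (1.0 - sequential_fraction) / processors)

# Smaller sequential portion -> better scalability (illustrative values only).
for s in (0.50, 0.10, 0.01):
    print(f"sequential fraction {s:.2f} -> speedup on 32 CPUs: {speedup(s, 32):.2f}x")
```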
5 / 50
The growth of master files and magnetic tapes exploded around the mid-_______.
6 / 50
The _________ is only a small part in realizing the true business value buried within the mountain of data collected and stored within an organization's business systems and operational databases.
7 / 50
Data mining derives its name from the similarities between searching for valuable business information in a large database, for example, finding linked products in gigabytes of store scanner data, and mining a mountain for a _________ of valuable ore.
8 / 50
Change Data Capture is one of the challenging technical issues in _____________
9 / 50
In a traditional MIS system, there is an almost linear sequence of queries.
10 / 50
In horizontal splitting, we split a relation into multiple tables on the basis of
11 / 50
DOLAP allows download of “cube” structures to a desktop platform without the need for a shared relational or cube server.
12 / 50
Pakistan is one of the five major ________ countries in the world.
13 / 50
It is observed that every year, the amount of data recorded in an organization:
14 / 50
DTS allows us to connect to any data source or destination that is supported by ____________
15 / 50
NUMA stands for __________
16 / 50
17 / 50
For good decision making, data should be integrated across the organization to cross the LoB (Line of Business). This is to give a total view of the organization from:
18 / 50
Data mining is a/an ______ approach, where browsing through data using mining techniques may reveal something of interest to the user: information that was previously unknown.
19 / 50
Analytical processing uses ____________, instead of record-level access.
20 / 50
For a DWH project, the key requirements are ________ and product experience.
21 / 50
The degree of similarity between two records, often measured by a numerical value between _______, usually depends on application characteristics.
22 / 50
The automated, prospective analyses offered by data mining move beyond the analyses of past events provided by _____________ tools typical of decision support systems.
23 / 50
People who design and build the data warehouse must be capable of working across the organization at all levels.
24 / 50
Data mining uses _________ algorithms to discover patterns and regularities in data.
25 / 50
_____ modeling technique is more appropriate for data warehouses.
26 / 50
____________ in agriculture extension is the pest population beyond which the benefit of spraying outweighs its cost.
27 / 50
The users of a data warehouse are knowledge workers; in other words, they are _______ in the organization.
28 / 50
During the ETL process of an organization, suppose you have data that can be transformed using any of the transformation methods. Which of the following strategies will be your choice for the least complexity?
29 / 50
Pre-computed _______ can solve performance problems.
30 / 50
If every key in the data file is represented in the index file, then the index is:
31 / 50
_______________, if it fits into memory, costs only one disk I/O access to locate a record by a given key.
32 / 50
A virtual cube is used to query two similar cubes by creating a third “virtual” cube through a join between the two cubes.
33 / 50
The STAR schema used for data design is a __________ consisting of fact and dimension tables.
34 / 50
The pre-join technique is used to avoid
35 / 50
Focusing only on data warehouse delivery often ends up _________.
36 / 50
37 / 50
38 / 50
Kimball's iterative data warehouse development approach drew on decades of experience to develop the _________.
39 / 50
_________ breaks a table into multiple tables based upon common column values.
40 / 50
A ________ dimension is a collection of random transactional codes, flags, and/or text attributes that are unrelated to any particular dimension. The ______ dimension is simply a structure that provides a convenient place to store the ______ attributes.
41 / 50
42 / 50
Slice and Dice is changing the view of the data.
43 / 50
If some error occurs, execution will be terminated abnormally and all transactions will be rolled back. In this case, when we access the database, we will find it in the state it was in before the ____________.
44 / 50
The performance in a MOLAP cube comes from the O(1) look-up time for the array data structure.
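Note: the statement above can be illustrated with a small sketch. A MOLAP cube stores cell values in a dense multidimensional array, so a cell coordinate maps to a storage offset by arithmetic alone, with no search. The dimension names and sizes below are made-up examples, not quiz content.

```python
# Hypothetical 3-D cube: product x region x month, stored as one flat array.
N_PRODUCTS, N_REGIONS, N_MONTHS = 4, 3, 12
cells = [0.0] * (N_PRODUCTS * N_REGIONS * N_MONTHS)

def offset(product: int, region: int, month: int) -> int:
    # Row-major address calculation: constant time, no search -> O(1) look-up.
    return (product * N_REGIONS + region) * N_MONTHS + month

cells[offset(2, 1, 6)] = 1500.0      # store a measure for one cell
print(cells[offset(2, 1, 6)])        # retrieve it directly in O(1)
```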
45 / 50
________ is the technique in which existing heterogeneous segments are reshuffled and relocated into homogeneous segments.
46 / 50
Taken jointly, the extract programs or naturally evolving systems formed a spider web, also known as
47 / 50
Data mining evolved as a mechanism to cater for the limitations of _____ systems in dealing with massive data sets with high dimensionality, new data types, multiple heterogeneous data sources, etc.
48 / 50
Grain is the ________ level of data stored in the warehouse.
49 / 50
The purpose of the House of Quality technique is to reduce ______ types of risk.
50 / 50
When performing objective assessments, companies follow a set of principles to develop metrics specific to their needs; it is hard to have a “one size fits all” approach. Which of the following statements represents the pervasive functional forms?