CS614-Midterm
1 / 50
_____________ is a process that involves gathering information about a column through the execution of certain queries, with the intention of identifying erroneous records.
2 / 50
Multidimensional databases typically use proprietary __________ format to store pre-summarized cube structures.
3 / 50
Data mining uses _________ algorithms to discover patterns and regularities in data.
4 / 50
The automated, prospective analyses offered by data mining move beyond the analyses of past events provided by _____________ tools typical of decision support systems.
5 / 50
_________ breaks a table into multiple tables based upon common column values.
6 / 50
For a given data set, to get a global view in un-supervised learning we use
7 / 50
In horizontal splitting, we split a relation into multiple tables on the basis of
8 / 50
De-Normalization normally speeds up
9 / 50
DOLAP allows download of “cube” structures to a desktop platform without the need for a shared relational or cube server.
10 / 50
The need to synchronize data upon update is called
11 / 50
If every key in the data file is represented in the index file then index is :
12 / 50
Change Data Capture is one of the challenging technical issues in _____________
13 / 50
Pre-computed _______ can solve performance problems
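The blank refers to pre-computed aggregates (summary tables). As a minimal, hypothetical sketch of the idea, the following pre-computes group totals once so that later queries read the small summary instead of scanning every fact row; the data and names are illustrative, not from the quiz:

```python
from collections import defaultdict

# Hypothetical fact rows: (product, sales amount).
facts = [("A", 10), ("B", 5), ("A", 7), ("C", 3), ("B", 2)]

# Pre-compute the aggregate once (analogous to a summary table
# or materialized view in a data warehouse).
totals = defaultdict(int)
for product, amount in facts:
    totals[product] += amount

# Later queries hit the summary, not the full fact data.
print(totals["A"])  # 17
```

The same trade-off applies in a real warehouse: storage and refresh cost are paid once so repeated queries become cheap.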
14 / 50
_______________, if too big to fit into memory, will be expensive when used to find a record by a given key.
15 / 50
An optimized structure which is built primarily for retrieval, with update being only a secondary consideration, is
16 / 50
5 million bales.
17 / 50
18 / 50
DSS queries do not involve a primary key
19 / 50
Pipeline parallelism focuses on increasing throughput of task execution, NOT on __________ sub-task execution time.
20 / 50
In a _________ system, the contents change with time.
21 / 50
______ is a class of Decision Support Environment.
22 / 50
Transactional fact tables do not have records for events that do not occur. These are called
23 / 50
To measure or quantify similarity or dissimilarity, different techniques are available. Which of the following options names an available technique?
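Two of the standard distance measures covered in this context are Euclidean and Manhattan distance. A minimal illustration with made-up feature vectors (the data is hypothetical, not from the quiz):

```python
import math

# Two numeric records represented as feature vectors (hypothetical data).
a = [1.0, 2.0, 3.0]
b = [4.0, 6.0, 3.0]

# Euclidean distance: straight-line dissimilarity between the records.
euclidean = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Manhattan (city-block) distance: sum of absolute differences.
manhattan = sum(abs(x - y) for x, y in zip(a, b))

print(euclidean)  # 5.0
print(manhattan)  # 7.0
```

Smaller distances mean the two records are more similar under the chosen measure.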
24 / 50
Data mining evolved as a mechanism to address the limitations of ________ systems in dealing with massive data sets with high dimensionality, new data types, multiple heterogeneous data sources, etc.
25 / 50
Grain is the ________ level of data stored in the warehouse.
26 / 50
The goal of star schema design is to simplify ________
27 / 50
For a DWH project, the key requirements are ________ and product experience.
28 / 50
Kimball's iterative data warehouse development approach drew on decades of experience to develop the _________.
29 / 50
During ETL process of an organization, suppose you have data which can be transformed using any of the transformation method. Which of the following strategy will be your choice for least complexity?
30 / 50
A single database couldn't serve both operational high-performance transaction processing and DSS analytical processing at the same time.
31 / 50
Companies collect and record their own operational data, but at the same time they also use reference data obtained from _______ sources such as codes, prices etc.
32 / 50
Investing years in architecture and forgetting the primary purpose of solving business problems results in an inefficient application. This is an example of a _________ mistake.
33 / 50
Pakistan is one of the five major ________ countries in the world.
34 / 50
As opposed to the outcome of classification, estimation deals with ____________ valued outcomes.
35 / 50
Data mining is a/an __________ approach, where browsing through data using data mining techniques may reveal something of interest to the user as previously unknown information.
36 / 50
Ad-hoc access means running queries that are already known.
37 / 50
In DWH project, it is assured that ___________ environment is similar to the production environment
38 / 50
39 / 50
The degree of similarity between two records, often measured by a numerical value between _______, usually depends on application characteristics.
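The blank is the range 0 and 1. As an illustrative sketch (the bigram-based measure and test strings are my own choice, not taken from the quiz), Jaccard similarity over character bigrams always yields a score in [0, 1], with 1 meaning identical:

```python
def jaccard(s1: str, s2: str) -> float:
    """Jaccard similarity of two strings over their character bigrams.

    Returns a value in [0, 1]: 1.0 for identical bigram sets,
    0.0 when the sets share nothing.
    """
    grams1 = {s1[i:i + 2] for i in range(len(s1) - 1)}
    grams2 = {s2[i:i + 2] for i in range(len(s2) - 1)}
    if not grams1 and not grams2:
        return 1.0
    return len(grams1 & grams2) / len(grams1 | grams2)

print(jaccard("warehouse", "warehouse"))  # 1.0 (identical records)
print(jaccard("warehouse", "database"))   # small value: low similarity
```

In record matching, pairs scoring above an application-specific threshold would be treated as likely duplicates.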
40 / 50
If some error occurs, execution will be terminated abnormally and all transactions will be rolled back. In this case, when we access the database, we will find it in the state it was in before the ____________.
41 / 50
Collapsing tables can be done on the ___________ relationships
42 / 50
The automated, prospective analyses offered by data mining move beyond the analysis of past events provided by retrospective tools typical of ___________.
43 / 50
_____modeling technique is more appropriate for data warehouses.
44 / 50
The growth of master files and magnetic tapes exploded around the mid-_______.
45 / 50
46 / 50
People who design and build the data warehouse must be capable of working across the organization at all levels.
47 / 50
Naturally Evolving architecture occurred when an organization had a _______ approach to handling the whole process of hardware and software architecture.
48 / 50
The goal of ______ is to look at as few blocks as possible to find the matching records.
49 / 50
Normalization affects performance.
50 / 50