CISA Domain 3
Terms in this set (95)
Integrated Test Facility (ITF)
- Fictitious entity is created in LIVE environment
- This technique allows auditor to open a dummy account
- Auditor can enter dummy or test transactions and verify the processing and results of these transactions for correctness
- Processed results and expected results are compared to verify that systems are operating correctly
- Example: A dummy asset of $100,000 is entered into the system to verify whether it is being capitalized under the correct head and depreciation is calculated at the correct rate. This dummy transaction is subsequently removed after verification of the system controls
System Control Audit Review File (SCARF)
- In this technique an embedded (inbuilt) audit module is used to continuously monitor transactions
- This technique is used to collect data for special audit purposes
- The file records only those transactions that are of special audit significance, such as transactions above a specified limit or transactions involving deviations/exceptions
- On a regular basis, auditor gets a printout of the file for examination and verification
Snapshot Technique
- In this technique, snapshots (pictures) of transactions are taken as they move through various stages of the application system
- Both before-processing and after-processing images of the transactions are captured. The auditor can verify the correctness of processing by comparing the before-processing and after-processing images of the transactions
- Three important considerations in this technique are (i) the locations where snapshots are to be taken (ii) the timing of capturing the snapshots and (iii) the reporting of the snapshot data captured
Audit Hooks
- Audit software that captures suspicious transactions
- Criteria for suspicious transactions are designed by auditors as per their requirements
- For example, in most organizations cash transactions are monitored closely. Criteria can be designed to capture cash transactions exceeding $50,000. All such captured transactions are subsequently verified by the auditor to identify fraud, if any
- Useful when early detection of error or fraud is required
Continuous and Intermittent Simulation (CIS)
- This technique is variation of SCARF technique
- Can be used whenever the application system uses the database management system (DBMS)
- The DBMS reads the transaction, which is passed to CIS. If the transaction meets the selected criteria, CIS examines it for correctness
- determines whether any discrepancies exist between the results it produces and those the application system produces
- Such discrepancies are written to an exception log file
- Thus, CIS replicates or simulates the application system processing
- Since highly complex criteria can be set in CIS, it is the best technique for identifying transactions as per pre-defined criteria
SDLC - Unit Testing
- Testing is done by the developer as and when an individual program or module is ready. There is no need to wait until completion of the full software
- White box approach (i.e. testing of internal program logic) is applied in unit testing
SDLC - Integrated Testing/Interface Testing
Integrated testing involves testing the connection of two or more modules or components that pass information from one area to another
SDLC - System Testing
The primary reason for system testing is to evaluate the entire system functionality. System testing includes (i) Recovery testing (ii) Security testing (iii) Load testing (iv) Volume testing (v) Stress testing & (vi) Performance testing
Final Acceptance Testing
Includes (i) Quality Assurance Testing (QAT) & (ii) User Acceptance Testing (UAT)
Regression Testing
- Testing done again to ensure that changes or corrections in a program have not introduced new errors
- Data used for regression testing should be the same data as used in previous tests; only then can the results be meaningfully compared to verify that no new errors were introduced
Sociability Testing
- A test to ensure that a new or modified system can work in the specified environment without adversely impacting the existing system
Pilot Testing
- Takes place first at one location to review performance. The purpose is to see if the new system operates satisfactorily in one place before implementing it at other locations
Parallel Testing
- The process of comparing the results of the old and new systems. The purpose of this testing is to ensure that the implementation of the new system meets user requirements
White Box Testing vs Black Box Testing
- Program logic is tested
- Applicable for unit testing and integration testing
- Detailed knowledge of programming is required
- Only functionality is tested, program logic is not tested
- Applicable for user acceptance testing (UAT) and interface testing
- Testing can be performed without knowledge of programming
Alpha Testing vs Beta Testing
- Testing done by internal user
- Done prior to beta testing
- may not involve testing of full functionality
- Testing done by external users
- done after alpha testing
- generally, beta testing involves testing of full functionality
Top Down Approach vs Bottom up Approach
- Opposite of the bottom-up approach. Testing starts at the broader system level and then gradually moves toward individual programs and modules
- Advantages: (i) interface error can be detected earlier (ii) confidence in the system is achieved earlier
- More appropriate for prototype development
- Begin testing of individual units such as programs or modules and work upward until a complete system is tested
- Advantages: (i) Test can be started even before all programs are complete (ii) Errors in critical modules can be found early
Regression Testing vs Sociability Testing
- Test to ensure that corrections or changes have not introduced new errors
- Test to ensure that a new or modified system can work without adversely impacting the existing system
Unit Testing vs Interface Testing
- Involves testing of individual program or module
- Involves testing the connection of two or more components that pass information from one area to another
Check Digit
- A mathematically calculated value that is added to data to ensure that the original data have not been altered
- This helps in avoiding transposition and transcription errors
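A common concrete instance of a check digit is the Luhn algorithm used on payment card numbers. The set does not name a specific scheme, so Luhn here is an illustrative assumption:

```python
def luhn_check_digit(payload: str) -> int:
    """Compute the Luhn check digit for a numeric string.

    Doubling every second digit from the right makes single-digit
    transcription errors and most adjacent transpositions detectable."""
    total = 0
    for i, ch in enumerate(reversed(payload)):
        d = int(ch)
        if i % 2 == 0:          # these positions get doubled
            d *= 2
            if d > 9:
                d -= 9          # same as summing the two digits of d
        total += d
    return (10 - total % 10) % 10
```

Appending `luhn_check_digit("7992739871")` (which is 3) yields "79927398713"; a receiver can recompute the digit to detect alteration.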
Parity Check (Parity Bit)
- Indicates whether the number of 1 bits is odd or even. Generally the parity bit is 1 if the number of 1 bits is odd and 0 if it is even
- Verified by the receiving computer to ensure data completeness and data integrity during transmission
- Used to check for completeness of data transmissions. It is a hardware control that detects data errors when data are read from one computer to another, from memory or during transmission
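The even-parity rule above can be sketched in a few lines (function names are illustrative):

```python
def parity_bit(data: bytes) -> int:
    """Even-parity scheme: the bit is 1 when the count of 1-bits in
    the data is odd, 0 when it is even."""
    ones = sum(bin(b).count("1") for b in data)
    return ones % 2

def receiver_check(data: bytes, bit: int) -> bool:
    """The receiving computer recomputes the parity and compares it
    with the bit that accompanied the transmission."""
    return parity_bit(data) == bit
```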
Cyclic Redundancy Check (CRC)
- Works on the same principle as parity but is able to identify complex errors by increasing the complexity of the arithmetic
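A sketch of the sender/receiver sides of a CRC, using Python's standard-library CRC-32 (the framing layout is an assumption for illustration):

```python
import binascii

def frame_with_crc(payload: bytes) -> bytes:
    """Sender: append a CRC-32 (4 bytes, big-endian) to the payload."""
    return payload + binascii.crc32(payload).to_bytes(4, "big")

def crc_ok(frame: bytes) -> bool:
    """Receiver: recompute the CRC over the payload and compare it
    with the transmitted CRC; a mismatch signals a transmission error."""
    payload, received = frame[:-4], frame[-4:]
    return binascii.crc32(payload).to_bytes(4, "big") == received
```

Flipping even a single bit of the payload changes the CRC, so the receiver detects the corruption.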
Forward Error Control
- Works on the same principle as CRC. However, it also corrects the error. It provides the receiver with the ability to correct errors
Atomicity
- A feature of database systems where a transaction must be all-or-nothing; the transaction must either fully happen or not happen at all. The principle of atomicity requires that a transaction be completed in its entirety or not at all. If an error or interruption occurs, all changes made up to that point are backed out.
Critical Path Methodology (CPM)
- A technique for estimating project duration. All projects have at least one critical path
- The critical path is the sequence of activities whose total duration is the longest compared to any other path
- Thus, the length of the critical path represents the shortest possible time required for completing the project
- Activities on the critical path have zero slack time
- Alternatively, it can be said that activities with zero slack time are on a critical path
- Slack time can be defined as the amount of time an activity can be delayed without impacting the completion date of the project. Thus zero slack time makes an activity critical and concentration on such activities will help to reduce overall project completion time.
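The forward pass (earliest finish), backward pass (latest finish) and slack computation behind CPM can be sketched as follows; the five-activity network is hypothetical:

```python
# Hypothetical network: activity -> (duration, list of predecessors)
activities = {
    "A": (3, []), "B": (2, []),
    "C": (4, ["A"]), "D": (1, ["B"]),
    "E": (2, ["C", "D"]),
}

def critical_path(acts):
    ef = {}  # earliest finish times (forward pass)
    def finish(a):
        if a not in ef:
            dur, preds = acts[a]
            ef[a] = dur + max((finish(p) for p in preds), default=0)
        return ef[a]
    project = max(finish(a) for a in acts)  # project duration

    succs = {a: [s for s in acts if a in acts[s][1]] for a in acts}
    lf = {}  # latest finish times (backward pass)
    def latest(a):
        if a not in lf:
            lf[a] = min((latest(s) - acts[s][0] for s in succs[a]),
                        default=project)
        return lf[a]

    slack = {a: latest(a) - ef[a] for a in acts}
    # Activities with zero slack form the critical path
    return project, slack, [a for a in acts if slack[a] == 0]
```

Here A-C-E is critical (9 time units), while B and D each carry 4 units of slack and can be delayed without affecting completion.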
Program Evaluation Review Technique (PERT)
- A technique for estimating project duration
- The advantage of PERT over CPM is that CPM considers only a single duration, while PERT considers three different scenarios, i.e. optimistic (best), pessimistic (worst) and normal (most likely), and on the basis of these three scenarios a single critical path is arrived at
- PERT is more reliable than CPM for estimating project duration
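The three scenarios are usually combined with the classic PERT weighted-average formula, E = (O + 4M + P) / 6:

```python
def pert_estimate(optimistic: float, most_likely: float,
                  pessimistic: float) -> float:
    """PERT (beta-distribution) expected duration: the most-likely
    estimate is weighted four times as heavily as the extremes."""
    return (optimistic + 4 * most_likely + pessimistic) / 6
```

For example, an activity estimated at 4 (best), 6 (most likely) and 14 (worst) days has an expected duration of 7 days.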
Gantt Chart
- Progress of the entire project can be read from the chart to determine whether the project is behind, ahead of or on schedule compared to the baseline project plan
- Can also be used to track the achievement of milestones
Function Point Analysis (FPA)
- Indirect method of software size estimation
- Function points are a unit measure for software size much like an hour is to measuring time, miles are to measuring distance or Celsius is to measuring temperature
- The function point count is arrived at on the basis of the number and complexity of inputs, outputs, files, interfaces and queries
- FPA is more reliable than SLOC
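A minimal sketch of an unadjusted function point count. The weights below are the illustrative "average"-complexity values in the IFPUG style; the full method rates each component low/average/high before weighting:

```python
# Assumed average-complexity weights per component type
WEIGHTS = {"inputs": 4, "outputs": 5, "inquiries": 4,
           "internal_files": 10, "external_interfaces": 7}

def unadjusted_function_points(counts: dict) -> int:
    """Weighted sum of counted inputs, outputs, inquiries,
    internal files and external interfaces."""
    return sum(counts[k] * WEIGHTS[k] for k in counts)
```

For example, 10 inputs, 5 outputs, 2 internal files and 1 external interface give 40 + 25 + 20 + 7 = 92 unadjusted function points.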
Counting source lines of code (SLOC)
- SLOC is a direct method of software size estimation
Earned Value Analysis (EVA)
- EVA compares the following metrics at regular intervals: (i) budget to date (ii) actual spending to date (iii) estimate to complete (iv) estimate at completion
- It compares the planned amount of work with what has actually been completed to determine if the cost, schedule and work accomplished are progressing in accordance with the plan
- EVA is based on the premise that if a project task is assigned 24 hours for completion, it can be reasonably completed during that time frame. For example, a development team has spent eight hours of activity on the first day against a budget of 24 hours (over three days). The projected time to complete the remainder of the activity is 20 hours.
- Thus the value of actual work completed indicates a delay of 4 hours from schedule
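The arithmetic in the example above can be sketched as follows (the function name is illustrative): with 20 of 24 budgeted hours still to go, only 4 hours of value have been earned against the 8 hours planned for day one.

```python
def schedule_delay(budget_hours: float, planned_to_date: float,
                   estimate_to_complete: float) -> float:
    """Earned value = budgeted work minus the estimate to complete;
    the delay is planned work to date minus the value actually earned."""
    earned = budget_hours - estimate_to_complete
    return planned_to_date - earned

# The card's example: 24-hour task over 3 days; after day 1 (8 planned
# hours) the team estimates 20 hours remain, so only 4 hours were earned.
delay = schedule_delay(24, 8, 20)  # 4 hours behind schedule
```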
Timebox Management
- Major advantage of this approach is that it prevents project cost overruns and delays from scheduled delivery
- It is used for prototyping or rapid application development, where the project needs to be completed within a fixed time frame
- It integrates system and user acceptance testing, but does not eliminate the need for a quality process
What is Decision Support System (DSS)?
- An interactive system that supports semi-structured decision making. It collects data from varied sources and provides useful information to managers.
- Example of information that it provides:
- Comparative sales figures between one week and the next
- Projected revenue figures based on various assumptions
- Evaluation of various alternatives on the basis of past experience
Characteristics of DSS
- Handles unstructured problems; supports semi-structured or less-structured decisions
- Flexible and adaptable to a changing environment and to the decision-making approach of the users
DSS Efficiency vs Effectiveness
- A principle of DSS design is to concentrate less on efficiency (i.e. performing tasks quickly and reducing costs) and more on effectiveness (i.e. performing the right task)
DSS Design & Development
- Prototyping is the most popular approach to DSS design and development
DSS Implementation Risks (8)
- Non-existent or unwilling users
- Multiple users or implementers
- Disappearing users, implementers and maintainers
- Inability to specify purpose or usage patterns in advance
- Inability to predict and cushion impact on all parties
- Lack or loss of support
- Lack of experience with similar systems
- Technical problems and cost effectiveness issues
Agile Development
- Allows the programmer to just start writing a program without spending much time on preplanning documentation
- Less importance is placed on formal paper-based deliverables, with the preference being to produce releasable software in short iterations, typically ranging from 4 to 8 weeks
- At the end of each iteration, the team considers and documents what worked well and what could have worked better, and identifies improvements to be implemented in subsequent iterations
Prototyping
- The process of creating systems through controlled trial and error
- A prototype is an early sample or model used to test a concept or process. It is a small-scale working system used to test assumptions, which may be about user requirements, program design or internal logic
- Can provide the organization with significant time and cost savings
- By focusing mainly on what the user wants and sees, developers may miss some of the controls that come from the traditional systems development approach; therefore, a potential risk is that the finished system will have poor controls
Rapid Application Development
Includes use of:
- Small and well-trained development teams
- Tools to support modelling, prototyping and component reusability
- Central repository
- Rigid limits on development time frames
Enables the organization to develop systems quickly while reducing development cost and maintaining quality.
Relies on the usage of a prototype that can be updated continually to meet changing user or business requirements
Object Oriented System Development
- OOSD is a programming technique and not a software development methodology
- In an object-oriented language, an application is made up of smaller components (objects)
- One major benefit is the ability to reuse objects
- OO uses a technique called "encapsulation" in which one object interacts with another object. This is a common practice whereby any particular object may call other objects to perform its work
Component Based Development
- Can be regarded as an outgrowth of object-oriented development
Reengineering
- The process of updating an existing system by extracting and reusing design and program components
- This process is used to support major changes in the way an organization operates
Reverse Engineering
- The process of studying and analyzing an application, and the information it uses, to develop a similar system
Performance Testing
- Evaluates the performance of the software under normal and peak conditions
Stress Testing
- Determines the capacity of the software to cope with an abnormal number of users or simultaneous operations
Recovery Testing
- Evaluates the ability of a system to recover after a failure
Volume Testing
- Evaluates the impact of an incremental volume of records (not users) on a system
This identifies specific program logic that has not been tested and analyzes programs during execution to indicate whether program statements have been executed
Sequence Check
- Refers to the continuity of serial numbers within the number range on documents
Principle that data integrity is maintained by ensuring that a transaction is either completed in its entirety or not at all
- Relational integrity testing is used to detect modification to sensitive data
- Used to ensure that batch data are completely and accurately transferred between two systems
Detects transmission errors by appending calculated bits onto the end of each segment of data
Run-to-Run Totals
- Processing control that ensures the completeness and accuracy of accumulated data
- Provide the ability to verify data values through the stages of application processing
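A minimal sketch of a run-to-run total, assuming a hypothetical `amount` field: the control total computed at one stage is carried forward and re-verified after the next stage.

```python
def control_total(records):
    """Control total over a hypothetical 'amount' field."""
    return sum(r["amount"] for r in records)

stage1 = [{"amount": 100}, {"amount": 250}, {"amount": 50}]
total_in = control_total(stage1)        # total entering the next stage

stage2 = [dict(r) for r in stage1]      # e.g. after validation/enrichment
total_out = control_total(stage2)       # recomputed after processing

# A mismatch between total_in and total_out would signal lost or
# altered records between processing stages.
```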
- Used to check for completeness of data transmissions
- Used within a network to make sure a transmission reached its destination in its entirety
Check Digit
- Numeric value that has been calculated mathematically and added to data to ensure that the original data have not been altered or an incorrect, but valid, value substituted
Key Verification
- Occurs when one employee enters the amount field and another employee re-enters the same data again
Completeness Check
- Used to determine if a field contains data and not zeroes or blanks
- Consistency - ensures that all integrity conditions in the database are maintained with each transaction
- Isolation - ensures that each transaction is isolated from other transactions; hence each transaction only accesses data that are part of a consistent database state
- Durability - ensures that when a transaction has been reported back to a user as complete, the resulting changes to the database will survive subsequent hardware or software failures
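The all-or-nothing (atomicity) behavior that complements these properties can be demonstrated with Python's built-in sqlite3 module; the table and values are illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('A', 100), ('B', 0)")
conn.commit()

try:
    with conn:  # transaction: commits on success, rolls back on exception
        conn.execute("UPDATE accounts SET balance = balance - 50 "
                     "WHERE name = 'A'")
        # Primary-key violation aborts the whole transaction...
        conn.execute("INSERT INTO accounts VALUES ('A', 999)")
except sqlite3.IntegrityError:
    pass  # ...so the debit of account A is rolled back as well

balance_a = conn.execute(
    "SELECT balance FROM accounts WHERE name = 'A'").fetchone()[0]
# balance_a is still 100: the transaction happened in its entirety or not at all
```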
Quality Assurance Testing (QAT)
Focuses on technical aspects of the system and verifies the system works as per the documented specifications
White Box Testing
- Examines a program's internal logic structure
- Used for unit and integration testing
Black Box Testing
- A dynamic analysis tool for testing software modules
- Tests functionality without regard to actual internal program structure
- Used for integration and user acceptance testing
- Ensures that expected results are produced from defined inputs
Applets
- Improve the performance of the web server and network in an Internet application
- Applets transfer some of the processing load to the client, thereby reducing the load on the server
Process of rerunning tests to ensure that modifications have not introduced new errors
User Acceptance Testing
- aka Final Acceptance Testing
- Failure of UAT results in the greatest impact to the organization in terms of delays and cost overruns, as compared to failure of other systems development testing
- Occurs during the SDLC implementation phase
Quality Assurance group is primarily responsible for:
ensuring that programs and program changes and documentation adhere to established standards
Common Gateway Interface (CGI)
Most often used as a consistent way for transferring data to the application program and back to the user
Advantages to using top-down testing:
- Tests of major functions and processing are conducted early
- Interface errors are detected sooner
- User and programmer confidence in the system is raised
- It is most effective during the initial (prototyping) phases
Series of tests to ensure that all components work properly together, and evaluates the system functionality
Waterfall Life Cycle Model
- Most appropriately used when requirements are well understood and are expected to remain stable, as is the business environment in which the system will operate.
Most common reason for the failure of information systems to meet the needs of the users is that:
User participation in defining the system's requirements was inadequate.
Advantages to using bottom-up testing
- Can be started before all programs are complete
- Errors in critical modules are found early
SDLC Planning Phase
- Phases and deliverables of an SDLC project should be determined
- 6 phases total
SDLC Phase 1 (Feasibility Study)
- Development of business case
- Return on investment
SDLC Phase 2 (Requirements Definition)
- Identify and specify business requirements for the system
- Early engagement of key users helps ensure business requirements are met during software development
- UAT plans are normally prepared in this phase
- IS auditors should ensure that security requirements of a new application development project are defined in the requirements phase
SDLC Phase 3 (Design)
- Program and database specifications
- Security considerations
- Procedures to prevent scope creep should be baselined in the design phase of the SDLC
- System flowcharts and entity relationships are developed
- Input/output definitions are developed (screen designs, reports)
- Data file or database system is determined
- Software baselining
SDLC Phase 4 (Development)
- Development testing
- Addressed primarily by application programmers and systems analysts
- Application and system testing is performed in this phase
SDLC Phase 5 (Implementation)
- UAT performed
- QAT - focuses on technical aspects of the system; verifies the system works as per the documented specifications
SDLC Phase 6 (Post-implementation)
- Should be performed jointly with project management and appropriate end users
- IS auditors involved in the implementation should not perform the implementation review
- Primary objective: assess whether expected project benefits were realized
Can be used to verify output results and control totals by matching them against the input data and control totals
Rapid Applications Development
- Allows quicker development of strategically important systems
- Uses rigid time-frame limits through timebox management
- Integrates system and user acceptance testing
- Reduces costs while maintaining quality by incorporating prototyping
Component Based Development
Assembling applications from cooperating packages of executable software that interact through well-defined and controlled interfaces. It is characterized by:
- Significance for web-based applications
- Improved quality
- Allowing developers to focus more strongly on business functionality
- Object-oriented design and development techniques that facilitate the ability to reuse modules
- Reduced development cost
Prototyping Development Method
- Process of creating a system through controlled trial-and-error procedures to reduce the risk in developing the system
- Reduces the time to deploy systems by using faster development tools such as 4GL programming languages
- A potential risk is that the finished system will have poor controls
- Provides significant time and cost savings
- Screens, interactive edits and sample reports are typical prototypes of an interactive application
Integrated Development Environment (IDE)
- Provides the benefit of expanding the programming resources and aids available
- Utilizes an online programming facility to allow programmers to create programs interactively in an integrated development environment
- Program libraries reside on the server
- Provides for faster development and the use of standard and structured programming techniques
Enhances an existing system by extracting and reusing design and program components
Evolutionary (Heuristic) Development
- uses Prototyping to develop specifications
Service Oriented Architecture
- A design pattern based on distinct pieces of software providing application functionality as services to other applications via a protocol. It is independent of any vendor, product or technology
- 4th-generation languages (4GLs) are high-level computer languages that provide fast iteration through successive designs
Integrated Test Facility (ITF)
- Periodic testing does not require separate test processes; ITF creates a fictitious entity in the database to process test transactions simultaneously with live input
- Test data must be isolated from production data
During the development of an application, quality assurance testing and user acceptance testing were combined. The MAJOR concern for an IS auditor reviewing the project is that there will be:
Improper acceptance of a program. The major risk of combining quality assurance testing and user acceptance testing is that the users may apply pressure to accept a program that meets their needs even though it does not meet quality assurance standards.
An IS auditor is involved in the reengineering process that aims to optimize IT infrastructure. Which of the following will BEST identify the issues to be resolved?
- Gap analysis
- Gap analysis indicates which parts of the current processes conform to good practices (the desired state) and which do not
When implementing an application software package, which of the following presents the GREATEST risk?
Incorrectly set parameters
Earned Value Analysis
- Method for measuring a project's progress at any given point in time, forecasting its completion date and final cost, and analyzing variances in the schedule and budget as the project proceeds
- It compares the planned amount of work with what has actually been completed to determine if the cost, schedule and work accomplished are progressing in accordance with the plan
- Works most effectively if a well-formed work breakdown structure exists
The MAJOR advantage of a component-based development approach is the:
support of multiple development environments.
Gantt Charts
- Help to identify activities that have been completed early or late through comparison to a baseline. Progress of the entire project can be read from the Gantt chart to determine whether the project is behind, ahead of or on schedule.
Software Baselining
- Provides a cutoff point for the design of the system and allows the project to proceed as scheduled without being delayed by scope creep.