Which of the following is least likely to be performed efficiently using data analytics?

The CPA Canada Guide to Audit Data Analytics is designed to assist and encourage practitioners to make use of technology-based data analytics in the audit of financial statements.

In response to a changing environment, increased use of audit data analytics (ADAs) is important in maintaining the relevance of the financial statement audit and in helping to improve its effectiveness and efficiency. These data analytics are audit procedures used to discover and analyze patterns, identify anomalies, and obtain other useful information from data populations relevant to the audit. Because of the power of advanced analytical tools, auditors using ADAs are able to perform procedures that may, in some cases, cover 100 per cent of the items in large data populations.

The main objective of auditors who make use of ADAs is to improve audit quality. For example, some data analytics may allow the auditor to use more complex data models to increase the rigour of audit procedures. Their use may improve the auditor’s ability to assess the risks of material misstatement in various accounts through analysis of up to 100 per cent of relevant data for those accounts. The use of ADAs also provides deeper insights into an organization’s systems and controls. A process-mining ADA, for instance, may be effective in identifying control deficiencies. Such insights may be a valuable by-product of the financial statement audit, with the potential to improve communications with management and those charged with governance. Additionally, the use of ADAs may enable many routine aspects of the audit to be automated, allowing auditors to devote more time to challenging matters requiring significant use of professional judgement.
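
To make the idea of a full-population procedure concrete, here is a minimal sketch in pandas that tests every journal entry against a few simple anomaly rules. The file name, column names and the $5,000 approval threshold are all hypothetical, and real ADAs would apply far richer models.

```python
import pandas as pd

# Load the full journal-entry population (file and column names are hypothetical).
entries = pd.read_csv("journal_entries.csv", parse_dates=["posted_at"])

# Test 100 per cent of the population, not a sample.
weekend = entries["posted_at"].dt.dayofweek >= 5            # Saturday/Sunday postings
after_hours = ~entries["posted_at"].dt.hour.between(7, 19)  # outside 7 a.m. to 7 p.m.
near_threshold = entries["amount"].between(4900, 4999.99)   # just under an assumed $5,000 approval limit

flagged = entries[weekend | after_hours | near_threshold]
print(f"{len(flagged)} of {len(entries)} entries flagged for follow-up")
```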

This non-authoritative guide has been adopted by CPA Canada from the AICPA audit guide of the same title. It was developed by a team of experts who have hands-on experience working with ADAs, which can be used to perform a variety of procedures to gather audit evidence.

This guide will:

  • make auditors aware of how ADAs may be efficiently and effectively used in each phase of a financial statement audit performed in accordance with Canadian Auditing Standards (CAS)
  • help auditors identify and address matters that they may encounter in deciding whether to use ADAs and, if so, how
  • assist auditors in applying ADAs in performing audit engagements

The use of ADAs has the potential to enhance traditional audit procedures, contribute to every phase of the audit and offer a new way of visualizing and analyzing results.

What is big data analytics?

Big data analytics is the often complex process of examining big data to uncover information, such as hidden patterns, correlations, market trends and customer preferences, that can help organizations make informed business decisions.

On a broad scale, data analytics technologies and techniques give organizations a way to analyze data sets and gather new information. By comparison, business intelligence (BI) queries answer basic questions about business operations and performance.

Big data analytics is a form of advanced analytics, which involves complex applications with elements such as predictive models, statistical algorithms and what-if analysis powered by analytics systems.
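
A small sketch may make the contrast concrete: a BI-style query describes what has already happened, while an advanced-analytics model supports a what-if question. The figures and column names below are invented purely for illustration.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Invented monthly figures, for illustration only.
sales = pd.DataFrame({
    "ad_spend": [10, 20, 30, 40, 50],
    "revenue":  [120, 210, 290, 410, 500],
})

# BI-style query: describe what has already happened.
print("Total revenue:", sales["revenue"].sum())

# Advanced analytics: fit a simple predictive model, then ask "what if?"
model = LinearRegression().fit(sales[["ad_spend"]], sales["revenue"])
what_if = pd.DataFrame({"ad_spend": [60]})
print("Predicted revenue at ad_spend=60:", model.predict(what_if)[0])
```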

Why is big data analytics important?

Organizations can use big data analytics systems and software to make data-driven decisions that can improve business-related outcomes. The benefits may include more effective marketing, new revenue opportunities, customer personalization and improved operational efficiency. With an effective strategy, these benefits can provide competitive advantages over rivals.

How does big data analytics work?

Data analysts, data scientists, predictive modelers, statisticians and other analytics professionals collect, process, clean and analyze growing volumes of structured transaction data as well as other forms of data not used by conventional BI and analytics programs.

Here is an overview of the four steps of the big data analytics process:

  1. Data professionals collect data from a variety of sources. Often, it is a mix of semistructured and unstructured data. While each organization will use different data streams, some common sources include:
  • internet clickstream data;
  • web server logs;
  • cloud applications;
  • mobile applications;
  • social media content;
  • text from customer emails and survey responses;
  • mobile phone records; and
  • machine data captured by sensors connected to the internet of things (IoT).
  2. Data is prepared and processed. After data is collected and stored in a data warehouse or data lake, data professionals must organize, configure and partition the data properly for analytical queries. Thorough data preparation and processing makes for higher performance from analytical queries.
  3. Data is cleansed to improve its quality. Data professionals scrub the data using scripting tools or data quality software. They look for any errors or inconsistencies, such as duplications or formatting mistakes, and organize and tidy up the data.
  4. The collected, processed and cleaned data is analyzed with analytics software (a minimal end-to-end sketch of all four steps follows this list). This includes tools for:
  • data mining, which sifts through data sets in search of patterns and relationships
  • predictive analytics, which builds models to forecast customer behavior and other future actions, scenarios and trends
  • machine learning, which taps various algorithms to analyze large data sets
  • deep learning, which is a more advanced offshoot of machine learning
  • text mining and statistical analysis software
  • artificial intelligence (AI)
  • mainstream business intelligence software
  • data visualization tools
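
As promised above, here is a minimal end-to-end sketch of the four steps in pandas. The log file, its fields and the analysis question are hypothetical; a production pipeline would run at far larger scale on dedicated infrastructure.

```python
import pandas as pd

# 1. Collect: pull raw records from, say, a web server log export
#    (the file name and fields are hypothetical).
raw = pd.read_json("web_server_log.json", lines=True)

# 2. Prepare and process: keep the fields the analysis needs and
#    partition the records by day.
prepared = raw[["user_id", "url", "timestamp"]].copy()
prepared["day"] = pd.to_datetime(prepared["timestamp"]).dt.date

# 3. Cleanse: drop duplicates and rows with missing keys.
clean = prepared.drop_duplicates().dropna(subset=["user_id", "url"])

# 4. Analyze: a simple pattern query, the most-visited pages per day.
top_pages = (clean.groupby(["day", "url"]).size()
                  .rename("hits")
                  .sort_values(ascending=False))
print(top_pages.head(10))
```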

Key big data analytics technologies and tools

Many different types of tools and technologies are used to support big data analytics processes. Common examples include:

  • Hadoop, which is an open source framework for storing and processing big data sets. Hadoop can handle large amounts of structured and unstructured data.
  • Predictive analytics hardware and software, which process large amounts of complex data, and use machine learning and statistical algorithms to make predictions about future event outcomes. Organizations use predictive analytics tools for fraud detection, marketing, risk assessment and operations.
  • Stream analytics tools, which are used to filter, aggregate and analyze big data that may be stored in many different formats or platforms.
  • Distributed storage, in which data is replicated, generally on a non-relational database. This can serve as a measure against independent node failures, lost or corrupted big data, or as a way to provide low-latency access.
  • NoSQL databases, which are non-relational data management systems that are useful when working with large sets of distributed data. They do not require a fixed schema, which makes them ideal for raw and unstructured data.
  • A data lake, which is a large storage repository that holds native-format raw data until it is needed. Data lakes use a flat architecture.
  • A data warehouse, which is a repository that stores large amounts of data collected by different sources. Data warehouses typically store data using predefined schemas.
  • Knowledge discovery/big data mining tools, which enable businesses to mine large amounts of structured and unstructured big data.
  • In-memory data fabric, which distributes large amounts of data across system memory resources. This helps provide low latency for data access and processing.
  • Data virtualization, which enables data access without technical restrictions.
  • Data integration software, which enables big data to be streamlined across different platforms, including Apache Hadoop, MongoDB and Amazon EMR.
  • Data quality software, which cleanses and enriches large data sets.
  • Data preprocessing software, which prepares data for further analysis. Data is formatted and unstructured data is cleansed.
  • Spark, which is an open source cluster computing framework used for batch and stream data processing (a batch sketch follows this list).
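
As a concrete example of the Spark entry above, a minimal PySpark batch job might look like the following. It assumes a local Spark installation; the input path and schema are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-demo").getOrCreate()

# Hypothetical clickstream export; Spark reads the whole directory as one batch.
events = spark.read.json("events/*.json")

# Unique users per day, written to the console.
daily = (events.withColumn("day", F.to_date("timestamp"))
               .groupBy("day")
               .agg(F.countDistinct("user_id").alias("unique_users")))
daily.orderBy("day").show()

spark.stop()
```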

Big data analytics applications often include data from both internal systems and external sources, such as weather data or demographic data on consumers compiled by third-party information services providers. In addition, streaming analytics applications are becoming common in big data environments as users look to perform real-time analytics on data fed into Hadoop systems through stream processing engines, such as Spark, Flink and Storm.
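
A streaming counterpart can be sketched with Spark Structured Streaming, one of the engines named above. The socket source, host and port here are placeholder choices; real deployments typically read from a message broker instead.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("stream-demo").getOrCreate()

# Read an unbounded stream of text lines from a socket source
# (host and port are placeholders).
lines = (spark.readStream.format("socket")
              .option("host", "localhost")
              .option("port", 9999)
              .load())

# Maintain a running count of each distinct line seen so far.
counts = lines.groupBy("value").count()

# Re-emit the full result table to the console after each micro-batch.
query = (counts.writeStream.outputMode("complete")
               .format("console")
               .start())
query.awaitTermination()
```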

Early big data systems were mostly deployed on premises, particularly in large organizations that collected, organized and analyzed massive amounts of data. But cloud platform vendors, such as Amazon Web Services (AWS), Google and Microsoft, have made it easier to set up and manage Hadoop clusters in the cloud. The same goes for Hadoop suppliers such as Cloudera, which supports the distribution of the big data framework on the AWS, Google and Microsoft Azure clouds. Users can now spin up clusters in the cloud, run them for as long as they need and then take them offline with usage-based pricing that doesn't require ongoing software licenses.

Big data has become increasingly beneficial in supply chain analytics. Big supply chain analytics utilizes big data and quantitative methods to enhance decision-making processes across the supply chain. Specifically, big supply chain analytics expands data sets for increased analysis that goes beyond the traditional internal data found on enterprise resource planning (ERP) and supply chain management (SCM) systems. Also, big supply chain analytics implements highly effective statistical methods on new and existing data sources.

Big data analytics uses and examples

Here are some examples of how big data analytics can be used to help organizations:

  • Customer acquisition and retention. Consumer data can help the marketing efforts of companies, which can act on trends to increase customer satisfaction. For example, personalization engines for Amazon, Netflix and Spotify can provide improved customer experiences and create customer loyalty.
  • Targeted ads. Personalization data from sources such as past purchases, interaction patterns and product page viewing histories can help generate compelling targeted ad campaigns for users on the individual level and on a larger scale.
  • Product development. Big data analytics can provide insights into product viability, inform development decisions, measure progress and steer improvements toward what fits a business's customers.
  • Price optimization. Retailers may opt for pricing models that draw on data from a variety of sources to maximize revenues.
  • Supply chain and channel analytics. Predictive analytical models can help with preemptive replenishment, B2B supplier networks, inventory management, route optimizations and the notification of potential delays to deliveries.
  • Risk management. Big data analytics can identify new risks from data patterns for effective risk management strategies.
  • Improved decision-making. Insights business users extract from relevant data can help organizations make quicker and better decisions.

Big data analytics benefits

The benefits of using big data analytics include:

  • Quickly analyzing large amounts of data from different sources, in many different formats and types.
  • Rapidly making better-informed decisions for effective strategizing, which can benefit and improve the supply chain, operations and other areas of strategic decision-making.
  • Cost savings, which can result from new business process efficiencies and optimizations.
  • A better understanding of customer needs, behavior and sentiment, which can lead to better marketing insights, as well as provide information for product development.
  • Improved, better informed risk management strategies that draw from large sample sizes of data.

Big data analytics challenges

Despite the wide-reaching benefits that come with using big data analytics, its use also comes with challenges:

  • Accessibility of data. With larger amounts of data, storage and processing become more complicated. Big data should be stored and maintained properly to ensure it can be used by less experienced data scientists and analysts.
  • Data quality maintenance. With high volumes of data coming in from a variety of sources and in different formats, data quality management for big data requires significant time, effort and resources.
  • Data security. The complexity of big data systems presents unique security challenges. Properly addressing security concerns within such a complicated big data ecosystem can be a complex undertaking.
  • Choosing the right tools. Selecting from the vast array of big data analytics tools and platforms available on the market can be confusing, so organizations must know how to pick the best tool that aligns with users' needs and infrastructure.
  • Internal skills gaps. With a potential lack of internal analytics skills and the high cost of hiring experienced data scientists and engineers, some organizations are finding it hard to fill the gaps.

History and growth of big data analytics

The term big data was first used to refer to increasing data volumes in the mid-1990s. In 2001, Doug Laney, then an analyst at consultancy Meta Group Inc., expanded the definition of big data. This expansion described the increasing:

  • Volume of data being stored and used by organizations;
  • Variety of data being generated by organizations; and
  • Velocity, or the speed at which that data was being created and updated.

Those three factors became known as the 3Vs of big data. Gartner popularized this concept after acquiring Meta Group and hiring Laney in 2005.

Another significant development in the history of big data was the launch of the Hadoop distributed processing framework. Hadoop was launched as an Apache open source project in 2006. This planted the seeds for a clustered platform, built on top of commodity hardware, that could run big data applications. The Hadoop framework of software tools is widely used for managing big data.

By 2011, big data analytics began to take a firm hold in organizations and the public eye, along with Hadoop and various related big data technologies.

Initially, as the Hadoop ecosystem took shape and started to mature, big data applications were primarily used by large internet and e-commerce companies such as Yahoo, Google and Facebook, as well as analytics and marketing services providers.

More recently, a broader variety of users have embraced big data analytics as a key technology driving digital transformation. Users include retailers, financial services firms, insurers, healthcare organizations, manufacturers, energy companies and other enterprises.

Which of the following is an auditor least likely to consider a departure from U.S. generally accepted accounting principles?

Including in inventory items that are consigned out to vendors but not yet sold.

Which of the following would occur during the planning stage of audit data analytics?

The auditor must determine the nature, timing and extent of the work that will be completed as part of the ADA.

Which one of the following procedures would not be appropriate for the auditors in discharging their responsibilities concerning the client's physical inventories?

Supervising the taking of the annual physical inventory.

Which of the following is not a part of the auditor's responsibility?

The auditor has no responsibility to plan and perform the audit to obtain reasonable assurance that misstatements, whether caused by errors or fraud, that are not material to the financial statements are detected.