
Alibaba Cloud Professional ACP Big Data Professional Practice Exam

The Alibaba Big Data Certification (ACP Level - Alibaba Cloud Certified Professional) covers fundamental distributed system theory and a range of Alibaba Cloud big data core services, such as MaxCompute, DataWorks, E-MapReduce, and visualization and BI tools, along with industry best practices. The certification evaluates proficiency in the following areas:

  • Professional IT knowledge, including a grasp of distributed system concepts such as Hadoop, fundamental database theory, and data analysis skills.
  • Ability to devise effective technological solutions and enterprise best practices leveraging Alibaba Cloud's big data platform.
  • Proficiency in utilizing Alibaba Cloud's big data computing service, online integrated development environment, and data integration products.
  • Capability to identify and resolve common issues encountered during the operation of business systems built on Alibaba Cloud Big Data products, employing optimal solutions.


Who should take the exam?

The Alibaba Big Data Certification (ACP Level - Alibaba Cloud Certified Professional) is intended for architects, developers, and O&M personnel who work with Alibaba Cloud Big Data products. It is designed for candidates who are familiar with big data concepts and have operational knowledge of Alibaba Cloud big data products.


Exam Details

  • Exam Name: Alibaba Cloud Professional ACP Big Data Professional
  • Question Format: Single-selection, multiple-selection, and true-or-false questions
  • Number of Questions: 60
  • Duration: 120 minutes
  • Passing Score: 65 out of 100
  • Exam Type: Closed-Book


Course Outline

The exam covers the following topics:

Topic 1: Learn about MaxCompute (37%)

  • Familiar with the basic concepts of the big data computing service (MaxCompute), including projects, tables, partitions, resources, tasks, etc.
  • Understand the architecture of the big data computing service, including its components and the function of each component.
  • Master the characteristics, advantages, and application scenarios of Alibaba Cloud's big data computing service.
  • Know how to connect to and use the computing service, including via the odpscmd client, the management console, the Java SDK, etc.
  • Know how to upload data to and download data from the big data computing service, can use the Tunnel command-line tool, and understand the Tunnel SDK.
  • Know how to use SQL commands for big data computing, including DDL, DML, and common built-in functions (see the connection and SQL sketch after this list).
  • Familiar with user-defined functions, including UDF, UDAF, and UDTF, and able to write simple custom functions.
  • Familiar with the MapReduce programming framework, can set up the IntelliJ IDEA integrated development environment, and can write a simple MapReduce program.
  • Understand the Graph programming framework, including its basic concepts and processing procedure, and can write a simple Graph program.
  • Familiar with the concepts and practical operation of MaxCompute security and permission management, including users, roles, authorization (ACL & Policy), project space protection, external access, security levels, etc.
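
For a concrete feel of the connection and SQL items above, here is a minimal sketch using the PyODPS SDK, one of several access paths alongside odpscmd, the management console, and the Java SDK. The credentials, project name, endpoint, and table are placeholders assumed for illustration, not values from this exam guide.

```python
# Minimal PyODPS sketch: connect to MaxCompute, run basic DDL/DML, and query
# with a common built-in aggregate. Requires `pip install pyodps`; all names
# and credentials below are hypothetical placeholders.
from odps import ODPS

o = ODPS(
    '<access_key_id>',        # placeholder credential
    '<access_key_secret>',    # placeholder credential
    project='demo_project',   # placeholder MaxCompute project
    endpoint='https://service.cn-hangzhou.maxcompute.aliyun.com/api',
)

# DDL: create a partitioned table.
o.execute_sql("""
    CREATE TABLE IF NOT EXISTS web_log (
        user_id BIGINT,
        url     STRING
    )
    PARTITIONED BY (dt STRING)
""")

# DML: insert a few rows into one partition.
o.execute_sql("""
    INSERT INTO TABLE web_log PARTITION (dt='20240101')
    VALUES (1, '/home'), (2, '/cart')
""")

# Query with a built-in function (COUNT) and read back the result set.
with o.execute_sql(
    "SELECT dt, COUNT(*) AS pv FROM web_log GROUP BY dt"
).open_reader() as reader:
    for record in reader:
        print(record['dt'], record['pv'])
```

Data upload and download for the Tunnel items can likewise be practiced with the odpscmd tunnel upload/download commands or the Tunnel SDK.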


Topic 2: DataWorks (29%)

  • Familiar with the basic functions of DataWorks, including data integration, data development, data management, the operation & maintenance center, organization management, and project management.
  • Understand the basic features of DataWorks, including role isolation, environment isolation, etc.
  • Knows how to leverage the project management and organization management modules to build a data analysis environment.
  • Proficient in design and development with the DataWorks data development module, including table creation, task development, resource upload, data upload, function creation, etc.
  • Able to use the DataWorks data development module to design and develop workflow tasks and node tasks, and can configure appropriate dependencies and periodic scheduling (see the scheduled-node sketch after this list).
  • Able to use the data management module for data management, including lineage analysis and applying for and granting table permissions, etc.
  • Able to identify, locate, and fix basic problems encountered in the process.
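
As a sketch of periodic scheduling in the data development module, the snippet below shows what a PyODPS node body might look like. It assumes that DataWorks injects the MaxCompute entry object `o` and the scheduling-parameter dict `args` at runtime, as PyODPS nodes conventionally do; the tables and the `bizdate` parameter are hypothetical and would be declared in the node's scheduling configuration (e.g., mapped to ${bdp.system.bizdate}).

```python
# Sketch of a DataWorks PyODPS node body (runs inside DataWorks, not standalone).
# Assumption: DataWorks provides the MaxCompute entry object `o` and the dict
# of scheduling parameters `args`; `bizdate` is a hypothetical parameter that
# the node's scheduling configuration would map to ${bdp.system.bizdate}.

bizdate = args['bizdate']  # e.g. '20240101', the scheduled business date

# Rebuild the daily partition of a hypothetical summary table from a
# hypothetical detail table produced by an upstream node this node depends on.
o.execute_sql(
    """
    INSERT OVERWRITE TABLE dws_user_pv PARTITION (dt='{dt}')
    SELECT user_id, COUNT(*) AS pv
    FROM   ods_web_log
    WHERE  dt = '{dt}'
    GROUP BY user_id
    """.format(dt=bizdate)
)
```

The dependency on the upstream node and the daily schedule themselves are configured in the node's scheduling settings rather than in code.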


Topic 3: Understand E-MapReduce (21%)

  • Know basic distributed system theory, such as the concepts of a distributed file system and a distributed computing framework.
  • Know how the common components in the Hadoop ecosystem work, e.g., the distributed file system (HDFS), the computation framework (MapReduce), the resource management component (YARN), and the resource coordination component (ZooKeeper); a distributed word-count sketch follows this list.
  • Familiar with the basic concepts of each E-MapReduce component, including YARN, Spark, ZooKeeper, Kafka, etc.
  • Familiar with the Auto Scaling feature, product advantages, and common application scenarios.
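
To make the distributed-computing items above concrete, here is a minimal PySpark word count of the kind that could be submitted to an E-MapReduce cluster (for example with spark-submit on the master node, with YARN managing resources and the input on HDFS). The paths are hypothetical.

```python
# Minimal PySpark word count: a distributed computation (Spark on YARN)
# reading from and writing to a distributed file system (HDFS).
# The HDFS paths are hypothetical placeholders.
from operator import add

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("emr-wordcount-sketch").getOrCreate()
sc = spark.sparkContext

counts = (
    sc.textFile("hdfs:///tmp/input/*.txt")   # read input splits in parallel
      .flatMap(lambda line: line.split())    # split each line into words
      .map(lambda word: (word, 1))           # emit (word, 1) pairs
      .reduceByKey(add)                      # aggregate counts per word
)

counts.saveAsTextFile("hdfs:///tmp/output/wordcount")  # write results to HDFS

spark.stop()
```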


Topic 4: Alibaba Big Data Ecosystem Tools (13%)

  • Has knowledge about the Alibaba Cloud Machine Learning Platform for AI (PAI).
  • Know about Alibaba Cloud's streaming data processing solutions and products (Realtime Compute, Apache Flink); a streaming SQL sketch follows this list.
  • Familiar with the basic concepts of Quick BI and knows the workflow for using Quick BI to gain better insight into data.
  • Understand the features and application scenarios of other related products, including Alibaba Cloud RDS, the distributed relational database DRDS, Table Store, AnalyticDB, Data Transmission Service (DTS), Realtime Compute, DataV, etc.
  • Understand how DataWorks Data Integration synchronizes data with other related products.
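
As a sketch of the streaming products mentioned above, the snippet below uses the open-source PyFlink Table API with the built-in datagen source and print sink so it can run locally; Realtime Compute for Apache Flink runs this kind of Flink SQL as a managed service. Table names, fields, and rates are made up for illustration.

```python
# Minimal PyFlink Table API sketch of the kind of streaming SQL that
# Realtime Compute for Apache Flink executes. It uses the built-in 'datagen'
# source and 'print' sink so it runs locally; all names are hypothetical.
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

# Unbounded source that generates fake click events.
t_env.execute_sql("""
    CREATE TABLE clicks (
        user_id BIGINT,
        url     STRING
    ) WITH (
        'connector' = 'datagen',
        'rows-per-second' = '5',
        'fields.user_id.min' = '1',
        'fields.user_id.max' = '5',
        'fields.url.length' = '10'
    )
""")

# Sink that prints the continuously updated aggregation result.
t_env.execute_sql("""
    CREATE TABLE click_counts (
        user_id BIGINT,
        cnt     BIGINT
    ) WITH (
        'connector' = 'print'
    )
""")

# Continuous aggregation: count clicks per user and emit updates to the sink.
t_env.execute_sql("""
    INSERT INTO click_counts
    SELECT user_id, COUNT(*) AS cnt
    FROM clicks
    GROUP BY user_id
""").wait()
```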

Practice Exam Details

  • Test Code: 8095-P
  • Availability: In Stock
  • Price: $7.99 (Ex Tax: $7.99)
