
ARA-R01 Questions and Answers

Question # 6

An Architect for a multi-national transportation company has a system that is used to check the weather conditions along vehicle routes. The data is provided to drivers.

The weather information is delivered regularly by a third-party company as a JSON structure, and the data is loaded into Snowflake in a column with a VARIANT data type. This table is queried directly to deliver the statistics to the drivers with minimal time lapse.

A single entry includes (but is not limited to):

- Weather condition: cloudy, sunny, rainy, etc.

- Degree

- Longitude and latitude

- Timeframe

- Location address

- Wind

The table holds more than 10 years' worth of data in order to deliver the statistics from different years and locations. The amount of data in the table increases every day.

The drivers report that they are not receiving the weather statistics for their locations in time.

What can the Architect do to deliver the statistics to the drivers faster?

A.

Create an additional table in the schema for longitude and latitude. Define a regular task to fill this information by extracting it from the JSON dataset.

B.

Add search optimization service on the variant column for longitude and latitude in order to query the information by using specific metadata.

C.

Divide the table into several tables for each year by using the timeframe information from the JSON dataset in order to process the queries in parallel.

D.

Divide the table into several tables for each location by using the location address information from the JSON dataset in order to process the queries in parallel.

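As context for option B, search optimization can be enabled for specific fields inside a VARIANT column. A minimal sketch, assuming a hypothetical WEATHER_DATA table with a VARIANT column named V:

-- Enable point-lookup optimization on two JSON fields (names illustrative).
ALTER TABLE WEATHER_DATA
  ADD SEARCH OPTIMIZATION ON EQUALITY(V:longitude, V:latitude);

-- Selective lookups on those fields can then benefit from the service.
SELECT V:condition, V:wind
FROM WEATHER_DATA
WHERE V:longitude = -73.99 AND V:latitude = 40.73;
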
Question # 7

An Architect needs to grant a group of ORDER_ADMIN users the ability to clean old data in an ORDERS table (deleting all records older than 5 years), without granting any privileges on the table. The group’s manager (ORDER_MANAGER) has full DELETE privileges on the table.

How can the ORDER_ADMIN role be enabled to perform this data cleanup, without needing the DELETE privilege held by the ORDER_MANAGER role?

A.

Create a stored procedure that runs with caller’s rights, including the appropriate "> 5 years" business logic, and grant USAGE on this procedure to ORDER_ADMIN. The ORDER_MANAGER role owns the procedure.

B.

Create a stored procedure that can be run using both caller’s and owner’s rights (allowing the user to specify which rights are used during execution), and grant USAGE on this procedure to ORDER_ADMIN. The ORDER_MANAGER role owns the procedure.

C.

Create a stored procedure that runs with owner’s rights, including the appropriate "> 5 years" business logic, and grant USAGE on this procedure to ORDER_ADMIN. The ORDER_MANAGER role owns the procedure.

D.

This scenario would actually not be possible in Snowflake – any user performing a DELETE on a table requires the DELETE privilege to be granted to the role they are using.

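To illustrate the owner's rights pattern these options describe, here is a minimal sketch, assuming ORDER_MANAGER creates the procedure; the names PURGE_OLD_ORDERS and ORDER_DATE are hypothetical:

-- Owner's rights: the procedure runs with the owning role's privileges,
-- so callers do not need DELETE on the table.
CREATE OR REPLACE PROCEDURE PURGE_OLD_ORDERS()
  RETURNS STRING
  LANGUAGE SQL
  EXECUTE AS OWNER
AS
$$
BEGIN
  DELETE FROM ORDERS WHERE ORDER_DATE < DATEADD(YEAR, -5, CURRENT_DATE());
  RETURN 'Old orders deleted';
END;
$$;

-- ORDER_ADMIN only needs USAGE on the procedure to run the cleanup.
GRANT USAGE ON PROCEDURE PURGE_OLD_ORDERS() TO ROLE ORDER_ADMIN;
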
Question # 8

An Architect needs to meet a company requirement to ingest files from the company's AWS storage accounts into the company's Snowflake Google Cloud Platform (GCP) account. How can the ingestion of these files into the company's Snowflake account be initiated? (Select TWO).

A.

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

B.

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 Glacier storage.

C.

Create an AWS Lambda function to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

D.

Configure AWS Simple Notification Service (SNS) to notify Snowpipe when new files have arrived in Amazon S3 storage.

E.

Configure the client application to issue a COPY INTO command to Snowflake when new files have arrived in Amazon S3 Glacier storage.

Option E is not relevant because it does not use Snowpipe, but rather the standard COPY command, which is a batch loading method. Moreover, the COPY command does not support ingesting files directly from Amazon S3 Glacier storage.

References:
• SnowPro Advanced: Architect | Study Guide
• Snowflake Documentation | Snowpipe Overview
• Snowflake Documentation | Using the Snowpipe REST API
• Snowflake Documentation | Loading Data Using Snowpipe and AWS Lambda
• Snowflake Documentation | Supported File Formats and Compression for Staged Data Files
• Snowflake Documentation | Using Cloud Notifications to Trigger Snowpipe
• Snowflake Documentation | Loading Data Using COPY into a Table
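As background for the Snowpipe REST options above, a REST-triggered pipe is created without AUTO_INGEST, and a client application or AWS Lambda function then calls the Snowpipe insertFiles endpoint to register new files. A minimal sketch with hypothetical stage, table, and bucket names (credentials and storage integration omitted):

-- External stage over the company's S3 bucket (illustrative names).
CREATE OR REPLACE STAGE S3_STAGE
  URL = 's3://company-bucket/landing/'
  FILE_FORMAT = (TYPE = 'JSON');

-- Pipe driven by the Snowpipe REST API (no AUTO_INGEST clause).
CREATE OR REPLACE PIPE INGEST_PIPE AS
  COPY INTO RAW_EVENTS
  FROM @S3_STAGE;
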
Question # 9

When loading data from a stage using COPY INTO, which options can be specified for the ON_ERROR clause?

    A.

    CONTINUE

    B.

    SKIP_FILE

    C.

    ABORT_STATEMENT

    D.

    FAIL

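For reference, ON_ERROR is specified directly on the COPY INTO statement. A sketch with hypothetical table and stage names:

-- Skip any file containing errors instead of aborting the whole load;
-- other documented values include CONTINUE, SKIP_FILE_<num>, and
-- ABORT_STATEMENT (the default).
COPY INTO MY_TABLE
FROM @MY_STAGE
FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
ON_ERROR = 'SKIP_FILE';
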

    Question # 10

    A Snowflake Architect is designing a multiple-account design strategy.

    This strategy will be MOST cost-effective with which scenarios? (Select TWO).

    A.

    The company wants to clone a production database that resides on AWS to a development database that resides on Azure.

    B.

    The company needs to share data between two databases, where one must support Payment Card Industry Data Security Standard (PCI DSS) compliance but the other one does not.

    C.

    The company needs to support different role-based access control features for the development, test, and production environments.

    D.

    The company security policy mandates the use of different Active Directory instances for the development, test, and production environments.

    E.

    The company must use a specific network policy for certain users to allow and block given IP addresses.

    Question # 11

    How can the Snowflake context functions be used to help determine whether a user is authorized to see data that has column-level security enforced? (Select TWO).

    A.

    Set masking policy conditions using current_role targeting the role in use for the current session.

    B.

    Set masking policy conditions using is_role_in_session targeting the role in use for the current account.

    C.

    Set masking policy conditions using invoker_role targeting the executing role in a SQL statement.

    D.

    Determine if there are ownership privileges on the masking policy that would allow the use of any function.

    E.

    Assign the accountadmin role to the user who is executing the object.

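To ground the context-function options, a masking policy can branch on a function such as CURRENT_ROLE(). A minimal sketch, with hypothetical role, table, and column names:

-- Only HR_ADMIN sees raw values; every other role sees a fixed mask.
CREATE OR REPLACE MASKING POLICY SSN_MASK AS (VAL STRING)
  RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() = 'HR_ADMIN' THEN VAL
    ELSE 'XXX-XX-XXXX'
  END;

ALTER TABLE EMPLOYEES MODIFY COLUMN SSN SET MASKING POLICY SSN_MASK;
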
    Question # 12

    Which command will create a schema without Fail-safe and will restrict object owners from passing on access to other users?

    A.

    create schema EDW.ACCOUNTING WITH MANAGED ACCESS;

    B.

create schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 7;

    C.

    create TRANSIENT schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 1;

    D.

    create TRANSIENT schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 7;

    Question # 13

    Which SQL alter command will MAXIMIZE memory and compute resources for a Snowpark stored procedure when executed on the snowpark_opt_wh warehouse?

(The ALTER statements for options A through D were provided as images and are not reproduced here.)

    A.

    Option A

    B.

    Option B

    C.

    Option C

    D.

    Option D

    Question # 14

    A Snowflake Architect is designing a multi-tenant application strategy for an organization in the Snowflake Data Cloud and is considering using an Account Per Tenant strategy.

    Which requirements will be addressed with this approach? (Choose two.)

    A.

    There needs to be fewer objects per tenant.

    B.

    Security and Role-Based Access Control (RBAC) policies must be simple to configure.

    C.

    Compute costs must be optimized.

    D.

    Tenant data shape may be unique per tenant.

    E.

    Storage costs must be optimized.

    Question # 15

    Which feature provides the capability to define an alternate cluster key for a table with an existing cluster key?

    A.

    External table

    B.

    Materialized view

    C.

    Search optimization

    D.

    Result cache

    Question # 16

    An Architect is designing a file ingestion recovery solution. The project will use an internal named stage for file storage. Currently, in the case of an ingestion failure, the Operations team must manually download the failed file and check for errors.

    Which downloading method should the Architect recommend that requires the LEAST amount of operational overhead?

    A.

    Use the Snowflake Connector for Python, connect to remote storage and download the file.

    B.

    Use the get command in SnowSQL to retrieve the file.

    C.

    Use the get command in Snowsight to retrieve the file.

    D.

    Use the Snowflake API endpoint and download the file.

    Question # 17

What Snowflake system functions are used to view and/or monitor the clustering metadata for a table? (Select TWO).

    A.

SYSTEM$CLUSTERING

    B.

SYSTEM$TABLE_CLUSTERING

    C.

SYSTEM$CLUSTERING_DEPTH

    D.

SYSTEM$CLUSTERING_RATIO

    E.

SYSTEM$CLUSTERING_INFORMATION

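For reference, these system functions are called as scalar functions. A sketch against a hypothetical SALES table:

-- Overall clustering health for the table's current cluster key.
SELECT SYSTEM$CLUSTERING_INFORMATION('SALES');

-- Average clustering depth, optionally for a chosen column list.
SELECT SYSTEM$CLUSTERING_DEPTH('SALES', '(SALE_DATE)');
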
    Question # 18

You are a Snowflake Architect in an organization. The business team has come to you to deploy a use case that requires loading data that they can visualize through Tableau. Every day new data comes in, and the old data is no longer required.

What type of table should be used in this case to optimize cost?

    A.

    TRANSIENT

    B.

    TEMPORARY

    C.

    PERMANENT

    Question # 19

    An Architect entered the following commands in sequence:

    USER1 cannot find the table.

    Which of the following commands does the Architect need to run for USER1 to find the tables using the Principle of Least Privilege? (Choose two.)

    A.

    GRANT ROLE PUBLIC TO ROLE INTERN;

    B.

    GRANT USAGE ON DATABASE SANDBOX TO ROLE INTERN;

    C.

    GRANT USAGE ON SCHEMA SANDBOX.PUBLIC TO ROLE INTERN;

    D.

    GRANT OWNERSHIP ON DATABASE SANDBOX TO USER INTERN;

    E.

    GRANT ALL PRIVILEGES ON DATABASE SANDBOX TO ROLE INTERN;

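As context, a role must hold USAGE on both the database and the schema before the objects inside become visible to it. A minimal sketch (SANDBOX and INTERN follow the question; the table name is hypothetical):

-- Least-privilege grants for the INTERN role to locate and query a table.
GRANT USAGE ON DATABASE SANDBOX TO ROLE INTERN;
GRANT USAGE ON SCHEMA SANDBOX.PUBLIC TO ROLE INTERN;
GRANT SELECT ON TABLE SANDBOX.PUBLIC.MY_TABLE TO ROLE INTERN; -- hypothetical table
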
    Question # 20

    How can the Snowpipe REST API be used to keep a log of data load history?

    A.

    Call insertReport every 20 minutes, fetching the last 10,000 entries.

    B.

    Call loadHistoryScan every minute for the maximum time range.

    C.

    Call insertReport every 8 minutes for a 10-minute time range.

    D.

Call loadHistoryScan every 10 minutes for a 15-minute range.

    Question # 21

A table, EMP_TBL, has three records as shown:

    The following variables are set for the session:

    Which SELECT statements will retrieve all three records? (Select TWO).

    A.

SELECT * FROM $tbl_ref WHERE $col_ref IN ('Name1','Name2','Name3');

    B.

SELECT * FROM EMP_TBL WHERE identifier($col_ref) IN ('Name1','Name2','Name3');

    C.

    SELECT * FROM identifier WHERE NAME IN ($var1, $var2, $var3);

    D.

SELECT * FROM identifier($tbl_ref) WHERE ID IN ('var1','var2','var3');

    E.

SELECT * FROM $tbl_ref WHERE $col_ref IN ($var1, $var2, $var3);

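As background, session variables are set with SET, and object names stored in variables must be resolved through IDENTIFIER(). A sketch using the question's EMP_TBL plus hypothetical variable values:

-- A variable can hold an object name or a literal value.
SET tbl_ref = 'EMP_TBL';
SET var1 = 'Name1';

-- IDENTIFIER() turns the variable into a table reference;
-- $var1 is substituted as an ordinary literal.
SELECT * FROM IDENTIFIER($tbl_ref) WHERE NAME = $var1;
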
    Question # 22

    What Snowflake features should be leveraged when modeling using Data Vault?

    A.

    Snowflake’s support of multi-table inserts into the data model’s Data Vault tables

    B.

    Data needs to be pre-partitioned to obtain a superior data access performance

    C.

    Scaling up the virtual warehouses will support parallel processing of new source loads

    D.

    Snowflake’s ability to hash keys so that hash key joins can run faster than integer joins

    Question # 23

    Which of the following are characteristics of Snowflake’s parameter hierarchy?

    A.

    Session parameters override virtual warehouse parameters.

    B.

    Virtual warehouse parameters override user parameters.

    C.

    Table parameters override virtual warehouse parameters.

    D.

    Schema parameters override account parameters.

    Question # 24

Files arrive in an external stage every 10 seconds from a proprietary system. The files range in size from 500 KB to 3 MB. The data must be accessible by dashboards as soon as it arrives.

    How can a Snowflake Architect meet this requirement with the LEAST amount of coding? (Choose two.)

    A.

    Use Snowpipe with auto-ingest.

    B.

    Use a COPY command with a task.

    C.

    Use a materialized view on an external table.

    D.

    Use the COPY INTO command.

    E.

    Use a combination of a task and a stream.

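For context on auto-ingest, the pipe is wired to cloud event notifications instead of client calls. A minimal sketch with hypothetical stage and table names (the stage's event notification setup is omitted):

-- New files in the external stage are loaded automatically when the
-- cloud provider's event notification reaches Snowpipe.
CREATE OR REPLACE PIPE DASHBOARD_PIPE
  AUTO_INGEST = TRUE
AS
  COPY INTO DASHBOARD_FEED
  FROM @EXT_STAGE
  FILE_FORMAT = (TYPE = 'JSON');
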
    Question # 25

An Architect needs to design a data unloading strategy for Snowflake that will be used with the COPY INTO command.

    Which configuration is valid?

    A.

- Location of files: Snowflake internal location
- File formats: CSV, XML
- File encoding: UTF-8
- Encryption: 128-bit

B.

- Location of files: Amazon S3
- File formats: CSV, JSON
- File encoding: Latin-1 (ISO-8859)
- Encryption: 128-bit

C.

- Location of files: Google Cloud Storage
- File formats: Parquet
- File encoding: UTF-8
- Compression: gzip

D.

- Location of files: Azure ADLS
- File formats: JSON, XML, Avro, Parquet, ORC
- Compression: bzip2
- Encryption: User-supplied key

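For reference, unloading uses the COPY INTO <location> form of the command. A sketch that unloads to an internal stage as gzip-compressed CSV (stage and table names are hypothetical):

-- Unload query results as compressed CSV files with a header row.
COPY INTO @UNLOAD_STAGE/exports/orders_
FROM (SELECT * FROM ORDERS)
FILE_FORMAT = (TYPE = 'CSV' COMPRESSION = 'GZIP')
HEADER = TRUE;
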
    Question # 26

    A retailer's enterprise data organization is exploring the use of Data Vault 2.0 to model its data lake solution. A Snowflake Architect has been asked to provide recommendations for using Data Vault 2.0 on Snowflake.

    What should the Architect tell the data organization? (Select TWO).

    A.

    Change data capture can be performed using the Data Vault 2.0 HASH_DIFF concept.

    B.

    Change data capture can be performed using the Data Vault 2.0 HASH_DELTA concept.

    C.

    Using the multi-table insert feature in Snowflake, multiple Point-in-Time (PIT) tables can be loaded in parallel from a single join query from the data vault.

    D.

    Using the multi-table insert feature, multiple Point-in-Time (PIT) tables can be loaded sequentially from a single join query from the data vault.

    E.

    There are performance challenges when using Snowflake to load multiple Point-in-Time (PIT) tables in parallel from a single join query from the data vault.

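To illustrate the multi-table insert feature referenced in options C and D, a minimal sketch that populates two hypothetical PIT tables from a single query:

-- One pass over the source query feeds both PIT tables.
INSERT ALL
  INTO PIT_CUSTOMER (HUB_KEY, LOAD_DATE) VALUES (HUB_KEY, LOAD_DATE)
  INTO PIT_ORDER    (HUB_KEY, LOAD_DATE) VALUES (HUB_KEY, LOAD_DATE)
SELECT HUB_KEY, LOAD_DATE
FROM DV_JOIN_VIEW; -- hypothetical view joining the data vault tables
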
    Question # 27

An Architect is designing a data lake with Snowflake. The company has structured, semi-structured, and unstructured data. The company wants to save the data inside the data lake within the Snowflake system. The company is planning on sharing data among its corporate branches using Snowflake data sharing.

    What should be considered when sharing the unstructured data within Snowflake?

    A.

    A pre-signed URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with no time limit for the URL.

    B.

    A scoped URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with a 24-hour time limit for the URL.

    C.

    A file URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with a 7-day time limit for the URL.

    D.

    A file URL should be used to save the unstructured data into Snowflake in order to share data over secure views, with the "expiration_time" argument defined for the URL time limit.

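As background on the URL types, Snowflake generates them with built-in functions. A sketch with a hypothetical stage and file path:

-- Scoped URL: encoded, and expires with the query result cache (24 hours).
SELECT BUILD_SCOPED_FILE_URL(@DOC_STAGE, 'reports/route_map.pdf');

-- File URL: a permanent reference that requires privileges on the stage.
SELECT BUILD_STAGE_FILE_URL(@DOC_STAGE, 'reports/route_map.pdf');
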
    Question # 28

    The following DDL command was used to create a task based on a stream:

Assuming MY_WH is set to AUTO_SUSPEND = 60 and is used exclusively for this task, which statement is true?

    A.

    The warehouse MY_WH will be made active every five minutes to check the stream.

    B.

    The warehouse MY_WH will only be active when there are results in the stream.

    C.

    The warehouse MY_WH will never suspend.

    D.

    The warehouse MY_WH will automatically resize to accommodate the size of the stream.

    Question # 29

    Which data models can be used when modeling tables in a Snowflake environment? (Select THREE).

    A.

    Graph model

    B.

    Dimensional/Kimball

    C.

    Data lake

    D.

Inmon/3NF

    E.

    Bayesian hierarchical model

    F.

    Data vault

    Question # 30

A group of Data Analysts have been granted the role ANALYST_ROLE. They need a Snowflake database where they can create and modify tables, views, and other objects to load with their own data. The Analysts should not have the ability to give other Snowflake users outside of their role access to this data.

    How should these requirements be met?

    A.

Grant ANALYST_ROLE OWNERSHIP on the database, but make sure that ANALYST_ROLE does not have the MANAGE GRANTS privilege on the account.

    B.

    Grant SYSADMIN ownership of the database, but grant the create schema privilege on the database to the ANALYST_ROLE.

    C.

    Make every schema in the database a managed access schema, owned by SYSADMIN, and grant create privileges on each schema to the ANALYST_ROLE for each type of object that needs to be created.

    D.

Grant ANALYST_ROLE ownership on the database, but grant the ownership on future [object type]s in database privilege to SYSADMIN.

    Question # 31

    A company is storing large numbers of small JSON files (ranging from 1-4 bytes) that are received from IoT devices and sent to a cloud provider. In any given hour, 100,000 files are added to the cloud provider.

    What is the MOST cost-effective way to bring this data into a Snowflake table?

    A.

    An external table

    B.

    A pipe

    C.

    A stream

    D.

    A copy command at regular intervals


    Question # 32

    Which steps are recommended best practices for prioritizing cluster keys in Snowflake? (Choose two.)

    A.

    Choose columns that are frequently used in join predicates.

    B.

    Choose lower cardinality columns to support clustering keys and cost effectiveness.

    C.

    Choose TIMESTAMP columns with nanoseconds for the highest number of unique rows.

    D.

    Choose cluster columns that are most actively used in selective filters.

    E.

    Choose cluster columns that are actively used in the GROUP BY clauses.

    Question # 33

    A new table and streams are created with the following commands:

    CREATE OR REPLACE TABLE LETTERS (ID INT, LETTER STRING) ;

    CREATE OR REPLACE STREAM STREAM_1 ON TABLE LETTERS;

    CREATE OR REPLACE STREAM STREAM_2 ON TABLE LETTERS APPEND_ONLY = TRUE;

    The following operations are processed on the newly created table:

    INSERT INTO LETTERS VALUES (1, 'A');

    INSERT INTO LETTERS VALUES (2, 'B');

    INSERT INTO LETTERS VALUES (3, 'C');

    TRUNCATE TABLE LETTERS;

    INSERT INTO LETTERS VALUES (4, 'D');

    INSERT INTO LETTERS VALUES (5, 'E');

    INSERT INTO LETTERS VALUES (6, 'F');

    DELETE FROM LETTERS WHERE ID = 6;

    What would be the output of the following SQL commands, in order?

    SELECT COUNT (*) FROM STREAM_1;

    SELECT COUNT (*) FROM STREAM_2;

    A.

    2 & 6

    B.

    2 & 3

    C.

    4 & 3

    D.

    4 & 6

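When reasoning about what each stream returns, it can help to look at the change rows and their metadata columns. A sketch:

-- A standard stream records inserts, updates, and deletes.
SELECT ID, LETTER, METADATA$ACTION, METADATA$ISUPDATE
FROM STREAM_1;

-- An append-only stream records inserts only, so deletes and
-- truncates leave no rows in STREAM_2.
SELECT ID, LETTER, METADATA$ACTION
FROM STREAM_2;
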
    Question # 34

    Based on the Snowflake object hierarchy, what securable objects belong directly to a Snowflake account? (Select THREE).

    A.

    Database

    B.

    Schema

    C.

    Table

    D.

    Stage

    E.

    Role

    F.

    Warehouse

    Question # 35

    Which of the following ingestion methods can be used to load near real-time data by using the messaging services provided by a cloud provider?

    A.

    Snowflake Connector for Kafka

    B.

    Snowflake streams

    C.

    Snowpipe

    D.

    Spark

    Question # 36

    A company is designing high availability and disaster recovery plans and needs to maximize redundancy and minimize recovery time objectives for their critical application processes. Cost is not a concern as long as the solution is the best available. The plan so far consists of the following steps:

    1. Deployment of Snowflake accounts on two different cloud providers.

    2. Selection of cloud provider regions that are geographically far apart.

    3. The Snowflake deployment will replicate the databases and account data between both cloud provider accounts.

    4. Implementation of Snowflake client redirect.

    What is the MOST cost-effective way to provide the HIGHEST uptime and LEAST application disruption if there is a service event?

    A.

    Connect the applications using the - URL. Use the Business Critical Snowflake edition.

    B.

    Connect the applications using the - URL. Use the Virtual Private Snowflake (VPS) edition.

    C.

    Connect the applications using the - URL. Use the Enterprise Snowflake edition.

    D.

    Connect the applications using the - URL. Use the Business Critical Snowflake edition.

    Question # 37

A company is trying to ingest 10 TB of CSV data into a Snowflake table using Snowpipe as part of its migration from a legacy database platform. The records need to be ingested in the MOST performant and cost-effective way.

    How can these requirements be met?

    A.

Use ON_ERROR = CONTINUE in the COPY INTO command.

    B.

Use PURGE = TRUE in the COPY INTO command.

    C.

Use PURGE = FALSE in the COPY INTO command.

    D.

Use ON_ERROR = SKIP_FILE in the COPY INTO command.

    Question # 38

    Why might a Snowflake Architect use a star schema model rather than a 3NF model when designing a data architecture to run in Snowflake? (Select TWO).

    A.

    Snowflake cannot handle the joins implied in a 3NF data model.

    B.

    The Architect wants to remove data duplication from the data stored in Snowflake.

    C.

    The Architect is designing a landing zone to receive raw data into Snowflake.

    D.

The BI tool needs a data model that allows users to summarize facts across different dimensions, or to drill down from the summaries.

    E.

    The Architect wants to present a simple flattened single view of the data to a particular group of end users.

    Question # 39

    What are characteristics of Dynamic Data Masking? (Select TWO).

    A.

A masking policy that is currently set on a table can be dropped.

    B.

    A single masking policy can be applied to columns in different tables.

    C.

    A masking policy can be applied to the value column of an external table.

    D.

The role that creates the masking policy will always see unmasked data in query results.

    E.

    A masking policy can be applied to a column with the GEOGRAPHY data type.

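As context, one masking policy object can be attached to columns in several tables, and it must be detached everywhere before it can be dropped. A sketch with hypothetical names:

-- The same policy attached to columns in two different tables.
ALTER TABLE CUSTOMERS MODIFY COLUMN EMAIL SET MASKING POLICY EMAIL_MASK;
ALTER TABLE PROSPECTS MODIFY COLUMN EMAIL SET MASKING POLICY EMAIL_MASK;

-- The policy must be unset from every column before it can be dropped.
ALTER TABLE CUSTOMERS MODIFY COLUMN EMAIL UNSET MASKING POLICY;
ALTER TABLE PROSPECTS MODIFY COLUMN EMAIL UNSET MASKING POLICY;
DROP MASKING POLICY EMAIL_MASK;
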
    Question # 40

    An Architect needs to improve the performance of reports that pull data from multiple Snowflake tables, join, and then aggregate the data. Users access the reports using several dashboards. There are performance issues on Monday mornings between 9:00am-11:00am when many users check the sales reports.

The size of the group has increased from 4 to 8 users. Waiting times to refresh the dashboards have increased significantly. Currently this workload is being served by a virtual warehouse with the following parameters:

AUTO_RESUME = TRUE, AUTO_SUSPEND = 60, SIZE = Medium

    What is the MOST cost-effective way to increase the availability of the reports?

    A.

    Use materialized views and pre-calculate the data.

    B.

    Increase the warehouse to size Large and set auto_suspend = 600.

    C.

    Use a multi-cluster warehouse in maximized mode with 2 size Medium clusters.

    D.

    Use a multi-cluster warehouse in auto-scale mode with 1 size Medium cluster, and set min_cluster_count = 1 and max_cluster_count = 4.

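For context on the multi-cluster options, auto-scale mode means the minimum and maximum cluster counts differ, and clusters are added only while queries queue. A sketch with a hypothetical warehouse name:

-- Auto-scale between 1 and 4 Medium clusters as concurrency demands.
ALTER WAREHOUSE REPORT_WH SET
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY = 'STANDARD';
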
    Question # 41

A user named USER_01 needs access to create a materialized view on the schema EDW.STG_SCHEMA. How can this access be provided?

    A.

    GRANT CREATE MATERIALIZED VIEW ON SCHEMA EDW.STG_SCHEMA TO USER USER_01;

    B.

GRANT CREATE MATERIALIZED VIEW ON DATABASE EDW TO USER USER_01;

    C.

    GRANT ROLE NEW_ROLE TO USER USER_01;

GRANT CREATE MATERIALIZED VIEW ON SCHEMA EDW.STG_SCHEMA TO NEW_ROLE;

    D.

    GRANT ROLE NEW_ROLE TO USER_01;

    GRANT CREATE MATERIALIZED VIEW ON EDW.STG_SCHEMA TO NEW_ROLE;

    Question # 42

    What is a key consideration when setting up search optimization service for a table?

    A.

    Search optimization service works best with a column that has a minimum of 100 K distinct values.

    B.

    Search optimization service can significantly improve query performance on partitioned external tables.

    C.

    Search optimization service can help to optimize storage usage by compressing the data into a GZIP format.

    D.

    The table must be clustered with a key having multiple columns for effective search optimization.

    Question # 43

    Which technique will efficiently ingest and consume semi-structured data for Snowflake data lake workloads?

    A.

    IDEF1X

    B.

    Schema-on-write

    C.

    Schema-on-read

    D.

    Information schema

    Question # 44

    An Architect clones a database and all of its objects, including tasks. After the cloning, the tasks stop running.

    Why is this occurring?

    A.

    Tasks cannot be cloned.

    B.

    The objects that the tasks reference are not fully qualified.

    C.

    Cloned tasks are suspended by default and must be manually resumed.

    D.

    The Architect has insufficient privileges to alter tasks on the cloned database.

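For reference, a task must be resumed explicitly before it runs on its schedule. A sketch with hypothetical database and task names:

-- Inspect task state in the clone, then resume the task.
SHOW TASKS IN DATABASE DEV_CLONE;
ALTER TASK DEV_CLONE.PUBLIC.LOAD_TASK RESUME;
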
    Question # 45

An Architect is designing a solution that will be used to process changed records in an ORDERS table. Newly inserted orders must be loaded into the F_ORDERS fact table, which will aggregate all the orders by multiple dimensions (time, region, channel, etc.). Existing orders can be updated by the sales department within 30 days after the order creation. In the case of an order update, the solution must perform two actions:

1. Update the order in the F_ORDERS fact table.

2. Load the changed order data into the special table ORDER_REPAIRS.

This table is used by the Accounting department once a month. If the order has been changed, the Accounting team needs to know the latest details and perform the necessary actions based on the data in the ORDER_REPAIRS table.

    What data processing logic design will be the MOST performant?

    A.

Use one stream and one task.

    B.

Use one stream and two tasks.

    C.

Use two streams and one task.

    D.

Use two streams and two tasks.

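As background for the stream-and-task options, a task can be gated on a stream so it runs only when changes exist. A minimal sketch with hypothetical warehouse, stream, and column names:

-- The task wakes on a schedule but executes only if the stream has data.
CREATE OR REPLACE TASK PROCESS_ORDER_CHANGES
  WAREHOUSE = ETL_WH
  SCHEDULE = '5 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
AS
  INSERT INTO ORDER_REPAIRS (ORDER_ID, AMOUNT, UPDATED_AT)
  SELECT ORDER_ID, AMOUNT, UPDATED_AT
  FROM ORDERS_STREAM
  -- updated rows surface as inserts flagged with METADATA$ISUPDATE
  WHERE METADATA$ACTION = 'INSERT' AND METADATA$ISUPDATE = TRUE;
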
    Question # 46

    How does a standard virtual warehouse policy work in Snowflake?

    A.

    It conserves credits by keeping running clusters fully loaded rather than starting additional clusters.

    B.

    It starts only if the system estimates that there is a query load that will keep the cluster busy for at least 6 minutes.

    C.

It starts only if the system estimates that there is a query load that will keep the cluster busy for at least 2 minutes.

    D.

    It prevents or minimizes queuing by starting additional clusters instead of conserving credits.

    Question # 47

    Which of the following are characteristics of how row access policies can be applied to external tables? (Choose three.)

    A.

    An external table can be created with a row access policy, and the policy can be applied to the VALUE column.

    B.

    A row access policy can be applied to the VALUE column of an existing external table.

    C.

    A row access policy cannot be directly added to a virtual column of an external table.

    D.

    External tables are supported as mapping tables in a row access policy.

    E.

    While cloning a database, both the row access policy and the external table will be cloned.

    F.

    A row access policy cannot be applied to a view created on top of an external table.

    Question # 48

    A media company needs a data pipeline that will ingest customer review data into a Snowflake table, and apply some transformations. The company also needs to use Amazon Comprehend to do sentiment analysis and make the de-identified final data set available publicly for advertising companies who use different cloud providers in different regions.

The data pipeline needs to run continuously and efficiently as new records arrive in the object storage, leveraging event notifications. Also, the operational complexity, the maintenance of the infrastructure (including platform upgrades and security), and the development effort should be minimal.

    Which design will meet these requirements?

    A.

    Ingest the data using COPY INTO and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

    B.

    Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Create an external function to do model inference with Amazon Comprehend and write the final records to a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

    C.

    Ingest the data into Snowflake using Amazon EMR and PySpark using the Snowflake Spark connector. Apply transformations using another Spark job. Develop a python program to do model inference by leveraging the Amazon Comprehend text analysis API. Then write the results to a Snowflake table and create a listing in the Snowflake Marketplace to make the data available to other companies.

    D.

    Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.
