
Snowflake SnowPro Specialty Gen AI

The Snowflake SnowPro Specialty: Gen AI certification validates your ability to design, build, and optimize generative AI solutions using the Snowflake Data Cloud. It is a specialty-level certification intended for professionals who already understand Snowflake fundamentals and want to demonstrate expertise in AI/ML and generative AI workflows inside Snowflake.



---------- Question 1
Which view should an administrator query to monitor the number of tokens consumed by specific Cortex LLM functions over the last 30 days for cost tracking purposes?
  1. CORTEX_FUNCTIONS_USAGE_HISTORY
  2. WAREHOUSE_METERING_HISTORY
  3. CORTEX_BILLING_METRICS
  4. DATA_TRANSFER_HISTORY
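As a study aid, token consumption per Cortex LLM function can be reviewed with a query along these lines (a sketch against the ACCOUNT_USAGE schema; column names should be verified against the current Snowflake documentation):

```sql
-- Sketch: aggregate Cortex LLM token usage over the last 30 days.
-- Assumes your role can read the SNOWFLAKE.ACCOUNT_USAGE schema.
SELECT
    function_name,
    model_name,
    SUM(tokens)        AS total_tokens,
    SUM(token_credits) AS total_credits
FROM SNOWFLAKE.ACCOUNT_USAGE.CORTEX_FUNCTIONS_USAGE_HISTORY
WHERE start_time >= DATEADD(day, -30, CURRENT_TIMESTAMP())
GROUP BY function_name, model_name
ORDER BY total_credits DESC;
```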

---------- Question 2
What is a key document requirement for uploading files to Document AI to ensure successful model training and extraction?
  1. Files must be in .xlsx or .csv format only.
  2. Individual files must not exceed the size limit, typically 100MB, and should be in supported formats like PDF.
  3. Documents must be pre-encrypted using a custom Snowflake managed key.
  4. Files must be converted to raw HTML before they can be processed by the model.

---------- Question 3
When training a model in Document AI, what is the recommended best practice for optimizing the questions used to extract values from documents?
  1. Use highly technical jargon and internal database column names as questions.
  2. Use natural language questions that are clear, concise, and descriptive of the target value.
  3. Format all questions as complex SQL regex patterns to guide the model.
  4. Limit questions to single words like Date or Price to avoid model confusion.

---------- Question 4
In the context of Cortex LLM Functions, what is the specific purpose of the CORTEX_ENABLED_CROSS_REGION account-level parameter when it is enabled (for example, set to 'ANY_REGION')?
  1. It allows the Model Registry to sync versions across different Snowflake regions
  2. It enables Document AI to process files stored in stages located in different regions
  3. It allows Snowflake to route LLM requests to other regions if local capacity is unavailable
  4. It synchronizes RBAC roles and privileges across a global Snowflake organization
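For reference, cross-region inference is enabled with an account-level parameter, roughly as follows (a sketch; 'ANY_REGION' is one documented value, and region groups such as 'AWS_US' can also be listed, so check the current docs for accepted values):

```sql
-- Sketch: allow Cortex to route LLM inference requests to other regions
-- when the local region lacks capacity. Requires ACCOUNTADMIN.
ALTER ACCOUNT SET CORTEX_ENABLED_CROSS_REGION = 'ANY_REGION';
```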

---------- Question 5
A data engineer needs to generate a JSON object from a large block of unstructured text using the SNOWFLAKE.CORTEX.COMPLETE function. Which approach ensures the output strictly adheres to a valid JSON schema?
  1. Use the SENTIMENT function and cast the result to VARIANT
  2. Pass a JSON schema into the options argument of the COMPLETE function
  3. Wrap the COMPLETE function inside a Snowflake Scripting TRY-CATCH block
  4. Increase the temperature parameter to 1.0 to encourage formatting
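For context, schema-constrained JSON output from COMPLETE is requested through the options argument, along these lines (a sketch: the model name and schema keys are illustrative, and structured outputs are only supported on certain models, so verify against the documentation):

```sql
-- Sketch: ask COMPLETE for output conforming to a JSON schema via the
-- response_format option (structured outputs). Schema is illustrative.
SELECT SNOWFLAKE.CORTEX.COMPLETE(
    'mistral-large2',
    [{'role': 'user', 'content': 'Summarize the following text: ...'}],
    {
        'response_format': {
            'type': 'json',
            'schema': {
                'type': 'object',
                'properties': {
                    'summary':   {'type': 'string'},
                    'sentiment': {'type': 'string'}
                },
                'required': ['summary', 'sentiment']
            }
        }
    }
);
```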

---------- Question 6
In a RAG application using Snowflake, a developer needs to convert a large collection of PDF text into a format suitable for semantic search. Which specific function should be used to generate the numerical representations required for a vector column?
  1. SNOWFLAKE.CORTEX.EMBED_TEXT_1024
  2. SNOWFLAKE.CORTEX.PARSE_DOCUMENT
  3. SNOWFLAKE.CORTEX.CLASSIFY_TEXT
  4. SNOWFLAKE.CORTEX.EXTRACT_ANSWER
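To make the vector-column workflow concrete, embedding chunked document text might look like this (a sketch: `raw_pdf_chunks` is a hypothetical table of pre-split text, and 'multilingual-e5-large' is one supported embedding model):

```sql
-- Sketch: embed text chunks into a 1024-dimension vector column
-- suitable for semantic search in a RAG pipeline.
CREATE OR REPLACE TABLE doc_chunks (
    chunk_text STRING,
    chunk_vec  VECTOR(FLOAT, 1024)
);

INSERT INTO doc_chunks
SELECT
    chunk_text,
    SNOWFLAKE.CORTEX.EMBED_TEXT_1024('multilingual-e5-large', chunk_text)
FROM raw_pdf_chunks;  -- hypothetical source table of pre-split PDF text
```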

---------- Question 7
To prevent the LLM from generating responses that contain hate speech or socially sensitive content, which feature should be utilized within the COMPLETE function arguments as a guardrail?
  1. Cortex Search
  2. Cortex Guard
  3. CORTEX_MODELS_ALLOWLIST
  4. Role-Based Access Control
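For reference, guardrails are switched on through the options argument of COMPLETE, roughly as follows (a sketch; the model name is illustrative):

```sql
-- Sketch: enable Cortex Guard to filter unsafe model responses.
SELECT SNOWFLAKE.CORTEX.COMPLETE(
    'llama3.1-70b',
    [{'role': 'user', 'content': 'User prompt goes here'}],
    {'guardrails': TRUE}
);
```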

---------- Question 8
A data engineer needs to generate a JSON-formatted response from a Snowflake Cortex LLM to ensure the output can be parsed by a downstream application. Which function or argument should be utilized?
  1. SNOWFLAKE.CORTEX.COMPLETE with Structured Outputs
  2. SNOWFLAKE.CORTEX.EXTRACT_ANSWER
  3. SNOWFLAKE.CORTEX.PARSE_DOCUMENT
  4. The SPLIT_TEXT_RECURSIVE_CHARACTER helper function

---------- Question 9
To monitor and optimize Snowflake Cortex spending, which system view should an administrator query to find the specific number of tokens processed for every individual function call made in the last 24 hours?
  1. METERING_DAILY_HISTORY
  2. CORTEX_FUNCTIONS_USAGE_HISTORY
  3. CORTEX_FUNCTIONS_QUERY_USAGE_HISTORY
  4. WAREHOUSE_METERING_HISTORY
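As a study aid, per-query token counts can be pulled from the query-level usage view, roughly as follows (a sketch: the join to QUERY_HISTORY for a time filter is illustrative, and column names should be verified against the ACCOUNT_USAGE documentation):

```sql
-- Sketch: token usage per individual Cortex function call in the
-- last 24 hours, joined to QUERY_HISTORY for the query start time.
SELECT
    u.query_id,
    u.function_name,
    u.model_name,
    u.tokens,
    u.token_credits
FROM SNOWFLAKE.ACCOUNT_USAGE.CORTEX_FUNCTIONS_QUERY_USAGE_HISTORY u
JOIN SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY h USING (query_id)
WHERE h.start_time >= DATEADD(hour, -24, CURRENT_TIMESTAMP())
ORDER BY u.token_credits DESC;
```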

---------- Question 10
When configuring Cortex Analyst, what is the purpose of the Verified Query Repository (VQR)?
  1. To store vector embeddings of the unstructured documentation
  2. To provide a set of 'ground truth' SQL examples that the model uses to improve accuracy
  3. To maintain a list of all users allowed to access the semantic views
  4. To cache the results of frequently asked natural language questions

