Alteryx Artificial Intelligence

Frequently Asked Questions

 

AI Features: General

 
Do Alteryx Products and Services include AI?

Alteryx products may include AI features. We may also give users the ability to connect to AI provided by another vendor, such as OpenAI. For example, our Alteryx Designer software can use the OpenAI Connector and Workflow Summary tools, which allow users to connect to OpenAI with the user’s own OpenAI API key. For our cloud products, such as Auto Insights, we may include AI features such as Playbooks and Magic Documents, which connect to Microsoft Azure Cognitive Services. And our AI Studio cloud product allows customers to use Large Language Models either from within the customer’s own cloud environment using Private Data Handling or in Alteryx’s managed environment. These are three examples of how we make AI available to our customers. For more information, please see the documentation (https://help.alteryx.com) for the specific product you’re using or are interested in using.

 
When I connect with your customer support will I be talking to a human or chatbot?

When you contact our customer support, you will be talking to a human. In our cloud products, such as Alteryx Designer Cloud, we offer an AI chatbot named “Fin.” Fin can be used to get quick answers to common questions. If at any time you would rather speak with a human, you can ask to be connected to a customer service representative.

 
How does Alteryx notify customers when AI features are included in the products they license?

As AI features are introduced into our products, we will provide relevant information directly to users through the user interface within our products. We will also include information about these features in our product release notes and in updates to our product documentation. To see our product release notes and documentation, please go to https://help.alteryx.com/.

 
How will we know when you modify one of your AI features?

When we modify an AI feature in one of our products, we will provide information about that change in our product release notes. You can see these release notes at https://help.alteryx.com/.

 

AI Features and Your Data

 
Do you train your AI on my data?

We do not train our AI models on customer data or personal data collected from users of our software.

 
Do you use Retrieval Augmented Generation (RAG) and what happens to the data used for RAG?

Alteryx users can configure Retrieval Augmented Generation (“RAG”) systems using AI Studio in conjunction with other parts of the Alteryx platform including Designer and/or App Builder. If a customer has Private Data Handling enabled, the RAG data remains within the customer’s managed Virtual Private Cloud. Otherwise, the data is securely stored in Alteryx’s managed data plane, which is hosted on Amazon Web Services (“AWS”) infrastructure.
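At a high level, the RAG pattern retrieves relevant stored data at query time and includes it in the prompt, rather than training the model on that data. The sketch below is a generic, simplified illustration of the pattern, not Alteryx’s implementation: it uses a toy keyword-overlap retriever where production systems use vector embeddings, and the documents and question are invented examples.

```python
# Minimal illustration of Retrieval Augmented Generation (RAG):
# find the stored document most relevant to the question, then
# prepend it to the prompt so the LLM can ground its answer in it.
# The scoring below is toy keyword overlap, not a real embedding search.

def retrieve(question: str, documents: list[str]) -> str:
    """Return the document sharing the most words with the question."""
    q_words = set(question.lower().split())
    return max(documents, key=lambda d: len(q_words & set(d.lower().split())))

def build_prompt(question: str, documents: list[str]) -> str:
    """Compose a grounded prompt from the retrieved context."""
    context = retrieve(question, documents)
    return f"Context: {context}\n\nQuestion: {question}\nAnswer using only the context."

docs = [
    "Q3 revenue grew 12 percent year over year.",
    "The support team resolved 98 percent of tickets within one day.",
]
prompt = build_prompt("How fast does support resolve tickets?", docs)
```

Because the model only sees the retrieved context inside the prompt, where that context is stored (the customer’s Virtual Private Cloud versus Alteryx’s managed data plane) is what the Private Data Handling option controls.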

 
Who owns and has access to the output generated from your AI?

As between you (the customer) and Alteryx, you own any original output generated from the AI features provided by Alteryx to you. Ownership of non-original output remains unchanged. For instance, if an output includes a quote from a published research article, any copyright associated with that quote remains with the author. You should also be aware that ownership rights may be affected by the content of the inputs you use to produce the output. Similar inputs that you or someone else uses may produce similar, non-original outputs, particularly when the LLM temperature setting (i.e., the setting that determines how creative the output will be) is low. You should always verify the originality of your outputs if you plan to claim any type of ownership rights.
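To illustrate why low temperature makes similar inputs converge on similar outputs: an LLM samples each token from a probability distribution over candidates, and dividing the model’s scores by the temperature before normalizing sharpens the distribution at low temperatures (the top choice almost always wins) and flattens it at high ones. This is a generic sketch of that mechanism, not tied to any particular Alteryx product or LLM; the logit values are invented.

```python
import math

def softmax(logits: list[float], temperature: float) -> list[float]:
    """Convert raw model scores to sampling probabilities at a given temperature."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]  # hypothetical scores for three candidate tokens
low = softmax(logits, temperature=0.2)   # sharply favors the top token
high = softmax(logits, temperature=2.0)  # spreads probability more evenly
```

At temperature 0.2 the top token takes nearly all of the probability mass, so repeated or similar prompts tend to reproduce the same output; at temperature 2.0 the alternatives remain likely enough that outputs vary.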

 
Do you share our AI prompts with third parties?

A customer’s AI prompts are Customer Content, meaning that, as between Alteryx and the customer, we consider them the customer’s information and do not share them with third parties. If the AI feature you are using connects you to a third-party hosted LLM, such as OpenAI’s ChatGPT or Google Gemini, your prompts will be processed by those LLMs. Please refer to the third party’s terms of use associated with those LLMs.

 
Can your AI model be customized to our specific needs?

For Alteryx AI Studio, you can fine-tune the chosen LLM or use Retrieval Augmented Generation (“RAG”) to inform the LLM on your data. For other products, such as Playbooks and Magic Documents in Auto Insights, the AI model is not customized, but our products are designed to help you apply the power of the LLM to gain valuable insights and help you better understand your data. For Alteryx AiDIN Co-Pilot, we will customize that AI Feature with Alteryx data to provide you insights and assistance to build better workflows more efficiently.

 
Where does my data go when I prompt a model?

When you prompt a model in one of our products, that prompt will be processed just as any data would. If the AI feature is using a third-party LLM, your prompt will be forwarded to the third-party LLM in order to generate a response. Please refer to the data handling information provided by that third party. In AI Studio, if you choose to use a local LLM, your prompt will be processed in a data plane either in your environment (if Private Data Handling is enabled) or in our managed cloud environment (e.g., AWS, Google Cloud, or Microsoft Azure). If we provide an LLM ourselves, your data is processed through that LLM but not stored beyond how we would already store your workflow data.

 

Transparency and Explainability

 
Do you train your own LLMs or do you use third party LLMs?

For some services, we let you use your own API key to connect to services like OpenAI. Examples include the OpenAI Connector and Workflow Summary tools in Designer. For services like Magic Documents and Playbooks in Auto Insights, we connect to OpenAI through Microsoft Azure Cognitive Services. For Alteryx Co-Pilot, we currently use Google Gemini. In those instances, OpenAI and Google are responsible for training their LLMs. Your data used by Magic Documents and Playbooks in Auto Insights is not used to improve Azure OpenAI models. For more information, go to https://learn.microsoft.com/en-us/legal/cognitive-services/openai/data-privacy. For AI Studio, we let you choose your third-party LLM, which means the third party you choose trains the LLM. If we provide any LLMs ourselves, rest assured we will train that LLM on our data and not yours.

 
Can I choose which LLMs I can use?

For AI Studio, you can choose from a list of LLMs. For Auto Insights, our AI features currently use Microsoft Azure Cognitive Services. Alteryx Co-Pilot currently uses Google Gemini. For our Designer software, we make available tools, such as the OpenAI Connector and Workflow Summary tools, that allow you to connect to OpenAI with your OpenAI API key. You may also design your own connectors using the functionality available in Designer to connect to other AI-based services. Please visit the product documentation (https://help.alteryx.com) to learn more about the LLMs in use or the other options available to you.

 
Where are the LLMs hosted?

For Alteryx AI Studio, the LLM may be hosted in your environment using our Private Data Handling configuration, or Alteryx can host the LLM in our managed environment. For more information about Private Data Handling, please see our Private Data Handling Whitepaper located on our Trust Page (www.alteryx.com/trust). For other AI features included in our products, we use third-party LLMs (OpenAI through Microsoft Azure Cognitive Services or Google Gemini). If we decide to use our own LLM in the future, we will host that LLM in the cloud using services such as AWS, Microsoft Azure, or Google Cloud Platform.

 
How are your LLMs updated and improved over time?

Most of what we provide will be through the use of third-party LLMs, such as Microsoft Cognitive Services (using OpenAI) and Google Gemini. As such, the extent and timing of any updates to these LLMs will be up to the third party providing the LLM. If you’ve purchased a product, such as AI Studio, that permits you to host another third-party LLM, then it will also be your choice when to update that LLM with updated offerings from that third party. If we use our own LLM, we will notify you of any updates to it through our documentation and release notes.

 
Is the output generated by your AI accurate?

We develop our AI features to be as accurate as possible. Accuracy depends on many factors, such as the quality of the input or prompt used to create the AI output, the LLM chosen, the temperature settings when using the AI, and other factors. As a result, accuracy can never be assured. Regardless of what AI you choose or where you get it from, you should always verify the accuracy of any output. We recommend building processes and procedures to consistently verify AI output. You might even consider how Alteryx products can help automate those processes.

 

Trust and Accountability/Reliability and Safety

 
How do you assess the risks of using AI in your products and services?

Alteryx conducts comprehensive risk assessments and audits of AI technologies to mitigate potential biases or inaccuracies, incorporating feedback loops for ongoing improvement.

 
How do you address potential bias or discrimination in LLMs?

If a third-party LLM is used, such as OpenAI, you should investigate the third party’s practices to address potential bias. As for Alteryx, we are committed to making AI technologies that are free from unfair discrimination or bias. Techniques such as diverse dataset training, algorithmic fairness checks, and continuous model evaluation will be employed to address bias and discrimination. Please visit our Alteryx Responsible AI Principles website at https://www.alteryx.com/trust/ai-principles for more information about our commitment to responsible AI practices.

 
What is prompt injection and how is it addressed in your solution?

Prompt injection is an attack in which an attacker crafts a text prompt, or instructs the AI to retrieve external data, in a way intended to make the AI perform unauthorized actions. Prompt injection might let the attacker make the AI say anything they want, or make the AI ignore earlier instructions in favor of later ones. At Alteryx, we have several safeguard mechanisms in place to prevent such scenarios. These include stringent input validation procedures and the use of AI models that can detect and filter out potentially harmful instructions.
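As a purely illustrative example of the input-validation idea, and not Alteryx’s actual safeguard, a pre-processing step might screen incoming prompts for phrases commonly associated with injection attempts before they ever reach the model. The pattern list below is a toy; real defenses layer many signals, including model-based classifiers.

```python
import re

# Hypothetical deny-list of phrases typical of prompt-injection attempts.
# Purely illustrative; a production filter would use many more signals.
SUSPICIOUS_PATTERNS = [
    r"ignore (all )?(previous|prior) instructions",
    r"disregard your (system|earlier) prompt",
    r"reveal your (hidden|system) prompt",
]

def is_suspicious(prompt: str) -> bool:
    """Flag prompts that match any known injection phrase."""
    text = prompt.lower()
    return any(re.search(pattern, text) for pattern in SUSPICIOUS_PATTERNS)
```

A flagged prompt could then be rejected, logged, or routed to a stricter handling path instead of being sent to the LLM.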

 

AI Features in our Products

 
What is "AiDIN"?

AiDIN is the collection of AI functionality available from Alteryx. To learn more about AiDIN, please visit https://www.alteryx.com/aidin.

 
What are Auto Insights Playbooks?

Playbooks automate the manually intensive process of identifying, selecting, and building new analytics use cases for end users. Playbooks save time and effort for business stakeholders and analytics professionals alike through automated use case generation and automated sample dataset generation. Playbooks execute several API calls as the user browses the Playbooks interface to generate use cases, synthetic data, reports, and more. For Playbooks with synthetic data, Auto Insights only sends the information the user enters into the text box (for example, role, company, department, or problem space) and the use case and report selection (both generated by AI) to Microsoft Cognitive Services (which uses OpenAI). For Playbooks with your real data, Auto Insights only sends the structure and randomized sample values of the datasets used to generate use cases, as well as the use case and report selection (both generated by AI), to OpenAI. All data is encrypted in transit and at rest within and between Alteryx Auto Insights and Microsoft Azure Cognitive Services. For more information, go to https://learn.microsoft.com/en-us/legal/cognitive-services/openai/data-privacy.

 
What are Auto Insights Magic Documents?

Magic Documents is an Auto Insights feature that uses generative AI to generate dynamic content. Magic Documents works by connecting to OpenAI via Microsoft Azure Cognitive Services. Your Organization Administrator can turn the Magic Documents feature off or on in the Admin Portal under the Feature Access tab. For information on how Magic Documents uses your data, go to the Usage of Magic Documents and Data Security article (https://help.alteryx.com/aac/en/auto-insights/data-security/data-security-when-using-magic-documents-or-playbooks.html). Auto Insights only sends the findings that are presented in the Mission Summary, as well as the configurations for each document that you select, such as:

  • Format: Email, Presentation, or Message
  • Audience
  • Include or exclude recommendations
  • Objective
  • Tone
  • Language
  • Length of the returned document
  • Topic
 
What is the Designer OpenAI Connector?

The OpenAI Connector enables generative AI to be embedded in Alteryx Designer workflows. It is a Designer Desktop tool that connects to OpenAI through your provided OpenAI API key to generate natural language outputs from user data and analytics. For more information, please see: https://community.alteryx.com/t5/Data-Science/Alteryx-and-Generative-AI-How-To-Use-the-Alteryx-OpenAI/ba-p/1164000.

 
What is Designer's Workflow Summary tool?

The Workflow Summary Tool is a Designer Desktop tool that uses OpenAI’s GPT API to summarize a workflow’s purpose, inputs, outputs, and key logic steps. It can be used to document workflows as they are created or to summarize existing workflows for easier understanding. For more information, please see: https://community.alteryx.com/t5/Data-Science/Introducing-the-Workflow-Summary-Tool-Powered-by-Generative-AI/ba-p/1122275.

 
What is Alteryx AiDIN Co-Pilot?

AiDIN Co-Pilot assists users by enabling natural language prompts to find and configure tools, or even build out entire workflows, in Designer Desktop. Currently powered by Google Gemini, AiDIN Co-Pilot provides Designer-specific assistance while ensuring that any data you upload for use with Designer is not passed to the LLM; only the input prompts are processed by the LLM. This design maintains the security and privacy of your underlying data, supporting compliance with data protection standards. For more information, please see https://community.alteryx.com/t5/Analytics/Introducing-AiDIN-Copilot/ba-p/1266248.

 
What is AI Studio?

Alteryx AI Studio is a no-code platform that can be used to integrate large language models (LLMs) into Alteryx workflows, allowing users to leverage generative AI for tasks such as text mining, content generation, and information retrieval. It provides data security and compliance by allowing organizations to host and manage LLMs within their private environments or within an environment hosted by Alteryx. Additionally, Alteryx AI Studio supports model customization by allowing the customer to fine-tune the LLM or use Retrieval Augmented Generation (“RAG”) to augment the LLM’s knowledge, which helps align AI responses with the customer’s specific business needs while maintaining strict governance and control over data usage.

 
Can I turn off AI Features?

In Designer Desktop, it is your choice whether to download and use the OpenAI Connector or the Workflow Summary tools. In Designer Cloud, our AI chatbot assistant “Fin” will be available to assist you, but you can always ask to speak to a human. In Auto Insights, your administrator has the option to turn the availability of Magic Documents or Playbooks off or on. AI Studio is itself an AI service, so if you choose to purchase and use it, turning its AI features off is not an option.