Alteryx Artificial Intelligence

Frequently Asked Questions

 

AI Features: General

 
Do Alteryx Products and Services include AI?

Alteryx products may include AI features. We may also give users the ability to connect to AI provided by another vendor, such as OpenAI. For example, our Alteryx Designer software offers the OpenAI Connector and Workflow Summary tools, which allow users to connect to OpenAI with their own OpenAI API key. Our cloud products, such as Auto Insights, may include AI features such as Playbooks and Magic Documents, which connect to Microsoft Azure Cognitive Services. For more information, please see the documentation (https://help.alteryx.com) for the specific product you’re using or are interested in using.

 
When I connect with your customer support will I be talking to a human or chatbot?

When you contact our customer support, you will be talking to a human. In our cloud products, such as Alteryx Designer Cloud, we offer an AI chatbot named “Fin,” which can be used to get quick answers to common questions. If at any time you would rather speak with a human, you can ask to be connected to a customer service representative.

 
How does Alteryx notify customers when AI features are included in the products they license?

As AI features are introduced into our products, we will provide relevant information directly to users through the product’s user interface. We will also include information about these features in our product release notes and in updates to our product documentation. To see our product release notes and documentation, please go to https://help.alteryx.com/.

 
How will we know when you modify one of your AI features?

When we modify an AI feature in one of our products, we will provide information about that change in our product release notes. You can see these release notes at https://help.alteryx.com/.

 

AI Features and Your Data

 
Do you train your AI on my data?

We do not train our AI models on customer data or personal data collected from users of our software.

 
Who owns and has access to the output generated from your AI?

As between you (the customer) and Alteryx, you own any original output generated from the AI features Alteryx provides to you. Ownership of non-original output remains unchanged. For instance, if an output includes a quote from a published research article, any copyright associated with that quote remains with the author. You should also be aware that ownership rights may be affected by the content of the inputs you use to produce the output. Similar inputs used by you or by someone else may produce similar, non-original outputs, particularly when the LLM’s temperature setting (the parameter that controls how varied or creative the output will be) is low. If you plan to claim any type of ownership rights, always verify the originality of your outputs.
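
For illustration only, the sketch below shows how a temperature value is typically passed in a third-party LLM API call, using the OpenAI Python client as an example. The model name and API key are placeholders, and Alteryx products may expose this setting differently; the point is simply that lower temperatures make repeated runs of the same prompt more likely to return the same, potentially non-original, output.

    # Illustrative sketch only: how a "temperature" setting is passed in a typical
    # LLM API call (OpenAI Python client). Model name and key are placeholders.
    from openai import OpenAI

    client = OpenAI(api_key="YOUR_OPENAI_API_KEY")  # hypothetical key

    prompt = "Summarize last quarter's sales trends in one sentence."

    for temperature in (0.0, 1.0):
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[{"role": "user", "content": prompt}],
            temperature=temperature,  # low = more deterministic, high = more varied
        )
        print(f"temperature={temperature}: {response.choices[0].message.content}")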

 
Do you share our AI prompts with third parties?

We do not share a Customer’s AI prompts with third parties. However, if the AI feature you are using connects you to a third-party-hosted LLM, such as OpenAI’s ChatGPT or Google Gemini, your prompts will be processed by those LLMs. Please refer to the third party’s terms of use associated with those LLMs.

 
Can your AI model be customized to our specific needs?

For products such as Playbooks and Magic Documents in Auto Insights, the AI model is not customized, but our products are designed to help you apply the power of the LLM to gain valuable insights and better understand your data. For Alteryx Copilot, we customize that AI feature with Alteryx data to provide you with insights and assistance so you can build better workflows more efficiently.

 
Where does my data go when I prompt a model?

Your prompt is sent to the third-party LLM to generate a response. Please refer to the data handling information provided by that third party.

 

Transparency and Explainability

 
Do you train your own LLMs or do you use third party LLMs?

For some services, we let you use your own API key to connect to services like OpenAI. Examples include the OpenAI Connector and Workflow Summary tools in Designer. For services like Magic Documents and Playbooks in Auto Insights, we connect to OpenAI through Microsoft Azure Cognitive Services. For Alteryx Copilot, we currently use Google Gemini. In those instances, OpenAI and Google are responsible for training their own LLMs. Your data used by Magic Documents and Playbooks in Auto Insights is not used to improve Azure OpenAI models. For more information, go to https://learn.microsoft.com/en-us/legal/cognitive-services/openai/data-privacy.

 
Can I choose which LLMs I use?

For Auto Insights, our AI features currently use Microsoft Azure Cognitive Services. Alteryx Copilot currently uses Google Gemini. For our Designer software, we make available tools such as the OpenAI Connector and Workflow Summary, which allow you to connect to OpenAI through your own OpenAI API key. You may also design your own connectors, using the functionality available in Designer, to connect to other AI-based services. Please visit the product documentation (https://help.alteryx.com) to learn more about the LLMs in use or the other options available to you.

 
Where are the LLMs hosted?

We use third-party LLMs (OpenAI through Microsoft Azure Cognitive Services, and Google Gemini), so the LLMs are hosted by those third-party providers.

 
How are your LLMs updated and improved over time?

We use third-party LLMs, such as OpenAI (through Microsoft Azure Cognitive Services) and Google Gemini. As such, the extent and timing of any updates to these LLMs are up to the third party providing the LLM.

 
Is the output generated by your AI accurate?

We develop our AI features to be as accurate as possible. Accuracy depends on many factors, including the quality of the input or prompt used to create the AI output, the LLM chosen, and the temperature setting in use. As a result, accuracy can never be assured. Regardless of which AI you choose or where you get it from, you should always verify the accuracy of any output. We recommend building processes and procedures to consistently verify AI output.

 

Trust and Accountability/Reliability and Safety

 
How do you assess the risks of using AI in your products and services?

Alteryx conducts comprehensive risk assessments and audits of AI technologies to mitigate potential biases or inaccuracies, incorporating feedback loops for ongoing improvement.

 
How do you address potential bias or discrimination in LLMs?

Whenever a third-party LLM is used, such as OpenAI, you should investigate the third party’s practices to address potential bias. As for Alteryx, we are committed to making AI technologies that are free from unfair discrimination or bias. Techniques such as diverse dataset training, algorithmic fairness checks, and continuous model evaluation will be employed to address bias and discrimination. Please visit our Alteryx Responsible AI Principles website at https://www.alteryx.com/trust/ai-principles for more information about our commitment to responsible AI practices.

 
What is prompt injection and how is it addressed in your solution?

Prompt injection is when an attacker crafts a text prompt, or instructs the AI to retrieve certain external data, with the intent of causing unauthorized actions. Prompt injection might let the attacker make the AI say anything they want, or make the AI ignore previous instructions in favor of later ones. At Alteryx, we have several safeguard mechanisms in place to prevent such scenarios, including stringent input validation procedures and the use of AI models that can detect and filter out potentially harmful instructions.
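
As a simplified illustration of what input validation against prompt injection can look like, the sketch below rejects user input that tries to override earlier instructions. This is not Alteryx’s actual safeguard implementation; the patterns and length limit are assumptions chosen for demonstration.

    # Illustrative sketch only: simple input validation against prompt injection.
    # The patterns and limit below are assumptions, not Alteryx's actual rules.
    import re

    SUSPICIOUS_PATTERNS = [
        r"ignore (all |any )?(previous|prior) instructions",
        r"disregard the system prompt",
        r"reveal your (system prompt|instructions)",
    ]
    MAX_PROMPT_LENGTH = 4000  # arbitrary cap for this example

    def validate_prompt(prompt: str) -> str:
        """Reject or sanitize user input before it is sent to an LLM."""
        if len(prompt) > MAX_PROMPT_LENGTH:
            raise ValueError("Prompt exceeds the allowed length.")
        for pattern in SUSPICIOUS_PATTERNS:
            if re.search(pattern, prompt, flags=re.IGNORECASE):
                raise ValueError("Prompt contains a potentially unsafe instruction.")
        return prompt.strip()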

 

AI Features in our Products

 
What is "AiDIN"?

AiDIN is the collection of AI functionality available from Alteryx. To learn more about AiDIN, please visit https://www.alteryx.com/aidin.

 
What are Auto Insights Playbooks?

Playbooks automates the manually intensive process of identifying, selecting, and building new analytics use cases for end users. It saves time and effort for business stakeholders and analytics professionals alike through automated use case generation and automated sample dataset generation. Playbooks executes several API calls as the user browses through the Playbooks interface to generate use cases, synthetic data, reports, and more. For Playbooks with synthetic data, Auto Insights only sends the information the user enters into the text box (for example, role, company, department, or problem space) and the use case and report selection (both generated by AI) to Microsoft Azure Cognitive Services (which uses OpenAI). For Playbooks with your real data, Auto Insights only sends the structure and randomized sample values of the datasets used to generate use cases, as well as the use case and report selection (both generated by AI), to OpenAI. All data is encrypted in transit and at rest within and between Alteryx Auto Insights and Microsoft Azure Cognitive Services. For more information, go to https://learn.microsoft.com/en-us/legal/cognitive-services/openai/data-privacy.
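
To make “structure and randomized sample values” concrete, here is a conceptual sketch of describing a dataset by its column names and types plus synthetic sample values, so real cell values never need to be sent. This is not Auto Insights’ actual code; the function and value generators are assumptions for illustration.

    # Illustrative sketch only: describe a dataset by schema plus synthetic values.
    # Not Auto Insights' implementation; the generators here are assumptions.
    import random
    import pandas as pd

    def randomized_preview(df: pd.DataFrame, rows: int = 3) -> dict:
        """Return column names/types and randomized sample rows, not real data."""
        preview = {"columns": {col: str(df[col].dtype) for col in df.columns},
                   "samples": []}
        for _ in range(rows):
            row = {}
            for col in df.columns:
                if pd.api.types.is_numeric_dtype(df[col]):
                    row[col] = round(random.uniform(0, 100), 2)  # synthetic numeric value
                else:
                    row[col] = f"sample_{random.randint(1, 999)}"  # synthetic text value
            preview["samples"].append(row)
        return preview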

 
What are Auto Insights Magic Documents?

Magic Documents is an Auto Insights feature that uses generative AI to generate dynamic content. It works by connecting to OpenAI via Microsoft Azure Cognitive Services. Your Organization Administrator can turn the Magic Documents feature on or off in the Admin Portal under the Feature Access tab. For information on how Magic Documents uses your data, go to the Usage of Magic Documents and Data Security article (https://help.alteryx.com/aac/en/auto-insights/data-security/data-security-when-using-magic-documents-or-playbooks.html). Auto Insights only sends the findings that are presented in the Mission Summary, as well as the configurations you select for each document, such as:

  • Format: Email, Presentation, or Message
  • Audience
  • Include or exclude recommendations
  • Objective
  • Tone
  • Language
  • Length of the returned document
  • Topic
 
What is the Designer OpenAI Connector?

The OpenAI Connector enables generative AI to be embedded in Alteryx Designer workflows. It is a Designer Desktop tool that connects to OpenAI using your own OpenAI API key to generate natural language outputs from user data and analytics. For more information, please see: https://community.alteryx.com/t5/Data-Science/Alteryx-and-Generative-AI-How-To-Use-the-Alteryx-OpenAI/ba-p/1164000.
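
For orientation, the sketch below shows the general shape of a call to OpenAI made with your own API key, turning a data record into a natural language sentence. It is not the Connector’s internal code; the key, model name, record, and prompt are placeholders.

    # Illustrative sketch only: generating text from a record via OpenAI using
    # your own API key. Key, model, record, and prompt are placeholders.
    from openai import OpenAI

    client = OpenAI(api_key="YOUR_OPENAI_API_KEY")  # your own key

    record = {"region": "EMEA", "revenue": 1250000, "change_pct": 12.4}

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": "Write a one-sentence business summary."},
            {"role": "user", "content": f"Summarize this record: {record}"},
        ],
    )
    print(response.choices[0].message.content)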

 
What is Designer's Workflow Summary tool?

The Workflow Summary Tool is a Designer Desktop tool that uses OpenAI’s GPT API to summarize a workflow’s purpose, inputs, outputs and key logic steps. It can be used to document workflows as they are created or to summarize existing workflows for easier understanding. For more information, please see: https://community.alteryx.com/t5/Data-Science/Introducing-the-Workflow-Summary-Tool-Powered-by-Generative-AI/ba-p/1122275.
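
As a small illustration, a workflow summary prompt might be assembled from the workflow’s tools and their configuration, then sent to OpenAI with your own API key in the same way as the Connector example above. The tool names and details below are made-up examples; the actual Workflow Summary tool reads the workflow itself and handles this for you.

    # Illustrative sketch only: building a summarization prompt from a workflow's
    # tool list. Tool names and details are made-up examples.
    workflow_tools = [
        ("Input Data", "reads sales.csv"),
        ("Filter", "keeps rows where region == 'EMEA'"),
        ("Summarize", "totals revenue by month"),
        ("Output Data", "writes emea_monthly.xlsx"),
    ]

    prompt = "Summarize the purpose, inputs, outputs, and key logic of this workflow:\n"
    prompt += "\n".join(f"- {name}: {detail}" for name, detail in workflow_tools)
    print(prompt)  # send this prompt to the LLM as in the Connector example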

 
What is Alteryx Copilot?

Alteryx Copilot assists users by enabling the use of natural language prompts to find and configure tools, or even build out entire workflows, in Designer Desktop. Currently powered by Google Gemini, Alteryx Copilot provides Designer-specific assistance while ensuring that any data you upload for use with Designer is not passed to the LLM; only your input prompts are processed by the LLM. This design maintains the security and privacy of your underlying data and supports compliance with data protection standards. For more information, please see https://community.alteryx.com/t5/Analytics/Introducing-AiDIN-Copilot/ba-p/1266248.

 
Can I turn off AI Features?

In Designer Desktop, it is your choice whether to download and use the OpenAI Connector or the Workflow Summary tools. In Designer Cloud, our AI chatbot assistant “Fin” is available to assist you, but you can always ask to speak to a human. In Auto Insights, your administrator has the option to turn the availability of Magic Documents and Playbooks on or off. AI Studio is itself an AI service, so if you choose to purchase and use AI Studio, turning its AI features off is not an option.