In this tutorial, we will connect TagoIO Analysis to OpenAI using the Vercel AI SDK. This SDK is a fantastic tool that standardizes how you interact with AI models, which means you can write clean, provider-agnostic code today and switch models or providers later without rewriting your logic.
By the end of this guide, you will have a script that reads device data and sends it to GPT-5.2 for interpretation.
Prerequisites
- A TagoIO account.
- An OpenAI API Key.
- A device with some data (or a simulated device).
Step 1: Secure Your Credentials
Hardcoding API keys is a security risk. TagoIO Secrets allow you to store sensitive credentials securely and access them only when needed.
- Go to Admin > Secrets in the sidebar.
- Click Add Secret.
- Name the key OPENAI_API_KEY.
- Paste your actual API key into the value field.
- Click Create Secret.
Now, your analysis can access this key without you ever exposing it in the code.
Step 2: Create the Analysis
We will use the TagoIO Deno runtime. It’s fast, secure, and supports npm packages natively.
- Go to Analysis in the sidebar.
- Click Add Analysis.
- Choose Node.js as the language (Deno is the default runtime for new scripts).
- Name it “AI Data Interpreter” and click Create Analysis.
The Code
Copy and paste the following code into your analysis editor. Or, if you prefer, install from the template here for US or here for Europe.
This script does three things:
- Retrieves your OpenAI key from the environment.
- Fetches the last 1,000 temperature readings from a specific device.
- Sends that data to GPT-5.2 to generate a human-readable summary.
import { generateText } from "npm:ai";
import { createOpenAI } from "npm:@ai-sdk/openai";
import { Analysis, Resources, Utils } from "npm:@tago-io/sdk";
async function startAnalysis(context) {
  // 1. Get Environment Variables
  const env_vars = Utils.envToJson(context.environment);
  if (!env_vars.OPENAI_API_KEY) {
    return console.log("Error: OPENAI_API_KEY not found in environment variables.");
  }

  // 2. Fetch IoT Data
  // Replace this ID with your actual Device ID
  const DEVICE_ID = "YOUR_DEVICE_ID_HERE";

  console.log("Fetching device data...");
  const deviceData = await Resources.devices.getDeviceData(DEVICE_ID, {
    variables: ["temperature"], // Adjust variable name as needed
    qty: 1000,
  });

  if (!deviceData.length) {
    return console.log("No data found for this device.");
  }

  // 3. Initialize OpenAI Provider
  const openaiProvider = createOpenAI({
    apiKey: env_vars.OPENAI_API_KEY,
  });

  console.log("Sending data to AI...");

  // 4. Generate AI Analysis
  // We stringify the data so the model can read it as context
  const { text } = await generateText({
    model: openaiProvider("gpt-5.2"),
    maxOutputTokens: 2000,
    system: "You are an expert industrial IoT analyst. You analyze raw sensor arrays to find anomalies, trends, or efficiency opportunities.",
    prompt: `Analyze the following temperature data array and provide a concise health report for the machine. Here is the data: ${JSON.stringify(deviceData)}`,
  });

  console.log("--- AI Analysis Result ---");
  console.log(text);
}

Analysis.use(startAnalysis);
Don’t forget to replace "YOUR_DEVICE_ID_HERE" with the actual ID of your device.
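If you would rather not hardcode the ID, one option is to pass it in as an analysis environment variable. The snippet below is a minimal sketch that replaces the constant in step 2 of the script; it assumes you added a variable named DEVICE_ID in the analysis settings (the variable name is our choice, not a TagoIO convention).

  // 2. Fetch IoT Data
  // Hypothetical DEVICE_ID environment variable, configured in the analysis
  // settings and read the same way as the API key in step 1
  const DEVICE_ID = env_vars.DEVICE_ID;
  if (!DEVICE_ID) {
    return console.log("Error: DEVICE_ID not found in environment variables.");
  }

This keeps the script itself free of account-specific values, so you can reuse it across devices by only changing the environment variable.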
Step 3: Configure Access Policies
For security, TagoIO analysis scripts run in a sandbox. They cannot access your account resources (like Secrets or Device Data) unless you explicitly grant permission.
- Go to Admin > Access Management.
- Click Add Policy.
- Name it “Analysis AI Access”.
- Target:
- Choose Analysis.
- Field: ID.
- Value: Select your “AI Data Interpreter” analysis.
- Permission:
- Rule 1: Allow Secrets.
- Rules: Access.
- Field: ID.
- Value: OPENAI_API_KEY.
- Rule 2: Allow Device.
- Rules: Access and Get Data.
- Field: ID.
- Value: Select the device you used in the code.
- Click Create Policy.
Step 4: Run and Test
- Go back to your Analysis.
- Click the Run button (console icon).
Check the Console tab. You should see the script fetch the data and then print a text summary generated by GPT-5.2 describing the trends in your temperature data.
Taking It Further
Now that you have the basic pipeline working, here are a few ways to make it more powerful:
- Automated HTML Reports: Instead of just logging the text, ask the AI to format the response as an HTML report. You can then pipe that HTML into the TagoIO PDF Service to email beautiful, AI-generated maintenance reports to your clients.
- Smart Alerts: Set up an Action in TagoIO to trigger this analysis only when a threshold is breached (e.g., temperature > 80°C). The AI can then look at the context around the spike to determine if it was a false alarm or a critical failure (a sketch of this follows below).
- Maintenance Suggestions: Pass device metadata (installation date, last service date) along with the sensor readings so the AI can suggest specific maintenance tasks (also sketched below).
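For the Smart Alerts idea, here is a minimal sketch of how the trigger could reach your script. It uses the same imports as the script above and assumes the analysis is started by an Action with a variable condition (for example, temperature > 80), in which case the data records that fired the Action arrive in the scope parameter; the 50-reading window is an arbitrary choice for illustration.

async function startAnalysis(context, scope) {
  const env_vars = Utils.envToJson(context.environment);
  const DEVICE_ID = "YOUR_DEVICE_ID_HERE";

  // scope holds the data records that triggered the Action
  const trigger = scope.find((item) => item.variable === "temperature");
  if (!trigger) {
    return console.log("Not triggered by a temperature reading, nothing to do.");
  }

  // Pull a window of recent readings so the model has context around the spike
  const recentData = await Resources.devices.getDeviceData(DEVICE_ID, {
    variables: ["temperature"],
    qty: 50,
  });

  const openaiProvider = createOpenAI({ apiKey: env_vars.OPENAI_API_KEY });
  const { text } = await generateText({
    model: openaiProvider("gpt-5.2"),
    maxOutputTokens: 1000,
    system: "You are an expert industrial IoT analyst.",
    prompt: `A temperature alert fired with value ${trigger.value} at ${trigger.time}. Here are the most recent readings: ${JSON.stringify(recentData)}. Was this likely a false alarm or a real problem? Answer in two or three sentences.`,
  });

  console.log(text);
}

Analysis.use(startAnalysis);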
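For the maintenance suggestions idea, a sketch of the prompt-building step is below. It slots in after step 3 of the main script (reusing DEVICE_ID, deviceData, and openaiProvider) and assumes you keep metadata such as installation and last-service dates as tags on the device; adjust the tag keys to whatever your account actually uses.

  // Fetch the device record so its metadata can be added to the prompt
  const deviceInfo = await Resources.devices.info(DEVICE_ID);

  const { text } = await generateText({
    model: openaiProvider("gpt-5.2"),
    maxOutputTokens: 2000,
    system: "You are an expert industrial IoT analyst. Suggest concrete, prioritized maintenance tasks.",
    prompt: `Device name: ${deviceInfo.name}
Device tags (may include installation and last service dates): ${JSON.stringify(deviceInfo.tags)}
Temperature readings: ${JSON.stringify(deviceData)}
Based on the metadata and readings, suggest specific maintenance tasks.`,
  });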
The Vercel AI SDK makes swapping models easy, so feel free to experiment with different providers to find the best balance of speed and intelligence for your specific use case.
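For example, switching to Anthropic is mostly a matter of swapping the provider package and the model name. The sketch below assumes you stored an ANTHROPIC_API_KEY secret (created the same way as OPENAI_API_KEY and granted to the analysis in your Access Policy), and the model ID is just a placeholder for whichever Claude model you want to try.

import { createAnthropic } from "npm:@ai-sdk/anthropic";

// Hypothetical ANTHROPIC_API_KEY secret, read from the environment like the OpenAI key
const anthropicProvider = createAnthropic({
  apiKey: env_vars.ANTHROPIC_API_KEY,
});

const { text } = await generateText({
  model: anthropicProvider("claude-sonnet-4-5"), // placeholder model ID
  maxOutputTokens: 2000,
  system: "You are an expert industrial IoT analyst.",
  prompt: `Analyze the following temperature data array: ${JSON.stringify(deviceData)}`,
});

The rest of the script stays the same, which is exactly the portability the SDK is designed for.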
Let us know in the comments what you build with this!
