116 changes: 116 additions & 0 deletions claude.md
@@ -81,6 +81,110 @@
- Keep meta descriptions under 200 characters
- Keep meta titles under 60 characters

## Variables and snippets

### Variable import hierarchy
Variables follow a bottom-up import hierarchy:
- Snippets import their own variables directly
- Parent files MUST NOT duplicate variable imports that come from their snippets
- Only import variables in the main file if they are used directly in that file's content

### Checking for duplicate imports
Before finalizing any file migration, systematically check for duplicate variable imports:

1. **Read all imported snippets** to see what variables they import
2. **Compare with main file imports** to identify duplicates
3. **Remove duplicates from main file** - the snippet's variables are automatically available
4. **Verify all text** in the main file uses variables (not plain text)

Example workflow:
```
Main file uses: {CLOUD_LONG}, {SERVICE_LONG}
Snippet A imports: SERVICE_LONG, CONSOLE
Snippet B imports: CLOUD_LONG, VPC

Result: Main file should import NOTHING - all variables come from snippets
```
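Applied to the workflow above, the imports could look like this (snippet filenames are illustrative):

```mdx
{/* /snippets/_snippet-a.mdx imports only what it uses */}
import { SERVICE_LONG, CONSOLE } from '/snippets/vars.mdx';

{/* Main file: no variable imports. {CLOUD_LONG} and {SERVICE_LONG}
   are already available through the imported snippets. */}
import SnippetA from '/snippets/_snippet-a.mdx';
import SnippetB from '/snippets/_snippet-b.mdx';
```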

### Variable application checklist
For EVERY file migration, systematically check the vars.mdx file and apply ALL relevant variables:

**Core product variables:**
- CLOUD_LONG, SERVICE_LONG, SERVICE_SHORT, SELF_LONG, SELF_LONG_CAP
- CONSOLE, TIMESCALE_DB, PG, COMPANY

**Feature variables:**
- HYPERTABLE, HYPERTABLE_CAP (for "hypertable(s)" / "Hypertable(s)")
- HYPERCORE, HYPERCORE_CAP (for "hypercore" / "Hypercore")
- COLUMNSTORE, COLUMNSTORE_CAP (for "columnstore" / "Columnstore")
- ROWSTORE, ROWSTORE_CAP (for "rowstore" / "Rowstore")
- CAGG, CAGG_CAP (for "continuous aggregate(s)" / "Continuous aggregate(s)")
- MAT_HYPERTABLE, MAT_HYPERTABLE_CAP (for "materialized hypertable(s)")
- VPC (for "VPC")

**Pricing variables:**
- PRICING_PLAN, SCALE, ENTERPRISE

**Process:**
1. Read snippets/vars.mdx to see all available variables
2. Search the file content for terms that match variable values
3. Replace ALL occurrences with variables
4. Check that variables aren't imported twice (main file + snippets)
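As a sketch of steps 2 and 3, a before/after pair might look like this (the exact text-to-variable mappings come from snippets/vars.mdx):

```mdx
{/* Before: plain text */}
In the console, create a hypertable for your Tiger Cloud service.

{/* After: variables applied */}
In {CONSOLE}, create a {HYPERTABLE} for your {SERVICE_LONG}.
```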

### Common patterns

**Pattern 1: Integration files**
```mdx
---
title: Integrate [Tool] with Tiger Cloud
sidebarTitle: [Tool]
description: [Tool description]
---

import IntegrationPrereqs from '/snippets/prerequisites/_integration-prereqs.mdx';
import OtherSnippet from '/snippets/path/_snippet.mdx';

[Tool][tool-link] does something with {SERVICE_LONG}.

## Prerequisites

<IntegrationPrereqs />

## Connect

Instructions with {CLOUD_LONG} and {SERVICE_SHORT} variables...

[tool-link]: https://example.com
```

Main file imports: NONE (if all variables come from snippets) or only those used directly in content

**Pattern 2: Snippet files**
```mdx
import { SERVICE_LONG, CONSOLE, VPC } from '/snippets/vars.mdx';

Content using {SERVICE_LONG}, {CONSOLE}, and {VPC}...
```

Snippets import only the variables they use directly.

**Pattern 3: Nested snippets**
If snippet A imports snippet B:
- Snippet B imports its own variables
- Snippet A only imports variables it uses directly (not from B)
- Main file that imports snippet A gets variables from both A and B automatically
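A minimal sketch of this nesting (filenames hypothetical):

```mdx
{/* /snippets/_snippet-b.mdx imports its own variables */}
import { VPC } from '/snippets/vars.mdx';

Attach the {VPC} to your project...

{/* /snippets/_snippet-a.mdx imports B, plus only the variables A uses directly */}
import SnippetB from '/snippets/_snippet-b.mdx';
import { CONSOLE } from '/snippets/vars.mdx';

In {CONSOLE}, open the networking tab.

<SnippetB />
```

A main file that imports snippet A can then use `{VPC}` and `{CONSOLE}` without importing either one itself.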

### Template literal syntax for Tab titles
When using variables in component props (like Tab titles), use template literal syntax:
```mdx
<Tab title={`${CLOUD_LONG}`}>
```

NOT:
```mdx
<Tab title="{CLOUD_LONG}">
```

## Git workflow
- NEVER use --no-verify when committing
- Ask how to handle uncommitted changes before starting
@@ -101,8 +205,20 @@

# Migration

## Critical migration requirements

**ALWAYS follow these steps for EVERY file migration:**

1. **Apply ALL relevant variables** - Systematically check snippets/vars.mdx and apply every applicable variable (see "Variables and snippets" section above)
2. **Check for duplicate imports** - Read all imported snippets to see what variables they import, then ensure the main file doesn't duplicate those imports
3. **Verify variable usage** - Ensure all content uses variables, not plain text for product names and features

## Migration steps

- Check the directory that the files are to move into
- Update all ${VARIABLES} to use the Mintlify variables (reference snippets/vars.mdx for mappings)
- **CRITICAL**: After migration, check that variables are not imported twice (see "Variables and snippets" section)
- **Remove all unsupported tags** - Delete tags like `<Tag>`, `<Procedure>`, and other custom components that are not supported in Mintlify
- Replace references to `import since<version>` with `<Icon icon="circle-play" iconType="duotone" />` Since `<version>` on its own line after the frontmatter, followed by a newline before content begins
- Replace references to `import deprecated<version>` with `<Icon icon="circle-pause" iconType="duotone" />` Deprecated `<version>` on its own line after the frontmatter, followed by a newline before content begins
- Replace references to `import DeprecationNotice` with `<Icon icon="circle-pause" iconType="duotone" />` Deprecated on its own line after the frontmatter, followed by a newline before content begins
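For example, a migrated page that carried a since-version import would begin like this (title and version number are illustrative):

```mdx
---
title: Example page
---

<Icon icon="circle-play" iconType="duotone" /> Since `2.18.0`

First paragraph of content...
```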
74 changes: 62 additions & 12 deletions docs.json
@@ -306,19 +306,19 @@
{
"group": "Time-series",
"pages": [
"tutorials/real-time-analytics/energy-consumption",
"tutorials/real-time-analytics/transport",
"tutorials/time-series/query-bitcoin-blockchain",
"tutorials/time-series/health-biometical-analysis",
"tutorials/time-series/anomaly-detection-in-streaming-data"
"tutorials/time-series/analyze-blockchain",
"tutorials/time-series/analyze-energy-consumption",
"tutorials/time-series/analyze-financial-tick-data",
"tutorials/time-series/analyze-transport-geospatial-data",
"tutorials/time-series/ingest-real-time-financial-data",
"tutorials/time-series/query-blockchain",
"tutorials/time-series/simulate-iot-sensor-data"
]
},
{
"group": "AI",
"pages": [
"tutorials/ai/financial-forcasting",
"tutorials/ai/powered-observability",
"tutorials/ai/real-time-predictive-maintenence"
"tutorials/ai/build-semantic-search-application"
]
}
]
@@ -328,7 +328,10 @@
"groups": [
{
"group": " ",
"pages": ["integrations/integrations"]
"pages": [
"integrations/integrations",
"integrations/find-connection-details"
]
},
{
"group": "Destination connectors",
@@ -338,13 +341,21 @@
"group": "Source connectors",
"pages": [
"integrations/connectors/source/sync-from-postgres",
"integrations/connectors/source/sync-from-s3"
"integrations/connectors/source/sync-from-s3",
"integrations/connectors/source/stream-from-kafka"
]
},
{
"group": "Coding",
"pages": ["integrations/code/start-coding-with-tigerdata"]
},
{
"group": "Business intelligence and data visualization",
"pages": [
"integrations/integrate/power-bi",
"integrations/integrate/tableau"
]
},
{
"group": "Configuration and deployment",
"pages": [
@@ -374,7 +385,7 @@
"integrations/integrate/datadog",
"integrations/integrate/grafana",
"integrations/integrate/prometheus",
"integrations/integrate/tableau"
"integrations/integrate/telegraf"
]
},
{
@@ -385,7 +396,8 @@
"integrations/integrate/pgadmin",
"integrations/integrate/postgresql",
"integrations/integrate/psql",
"integrations/integrate/qstudio"
"integrations/integrate/qstudio",
"integrations/integrate/supabase"
]
},
{
@@ -396,6 +408,10 @@
"integrations/integrate/google-cloud",
"integrations/integrate/microsoft-azure"
]
},
{
"group": " ",
"pages": ["integrations/troubleshooting"]
}
]
},
@@ -530,6 +546,40 @@
}
]
},
"redirects": [
{
"source": "/tutorials/latest/real-time-analytics-energy-consumption/:slug*",
"destination": "/tutorials/time-series/analyze-energy-consumption"
},
{
"source": "/tutorials/latest/blockchain-analyze/:slug*",
"destination": "/tutorials/time-series/analyze-blockchain"
},
{
"source": "/tutorials/latest/blockchain-query/:slug*",
"destination": "/tutorials/time-series/query-blockchain"
},
{
"source": "/tutorials/latest/simulate-iot-sensor-data/:slug*",
"destination": "/tutorials/time-series/simulate-iot-sensor-data"
},
{
"source": "/tutorials/latest/real-time-analytics-transport/:slug*",
"destination": "/tutorials/time-series/analyze-transport-geospatial-data"
},
{
"source": "/tutorials/latest/financial-tick-data/:slug*",
"destination": "/tutorials/time-series/analyze-financial-tick-data"
},
{
"source": "/tutorials/latest/financial-ingest-real-time/:slug*",
"destination": "/tutorials/time-series/ingest-real-time-financial-data"
},
{
"source": "/tutorials/latest/:slug*",
"destination": "/tutorials/tutorials"
}
],
"contextual": {
"options": ["claude", "chatgpt"]
},
45 changes: 21 additions & 24 deletions integrations/code/start-coding-with-tigerdata.mdx
@@ -1,54 +1,51 @@
---
title: Start coding with Tiger Data
description: Integrate Tiger Cloud with your app using your preferred programming language. Connect to a service,
create and manage hypertables, then and ingest and query data
products: [cloud, self_hosted, mst]
description: Integrate Tiger Cloud with your app using your preferred programming language. Connect to a service, create and manage hypertables, then ingest and query data
keywords: [coding, programming, SDKs, client libraries, Python, Node.js, Java, Ruby, Golang, database drivers, application integration]
---

import StartCodingRuby from "/snippets/coding/_start-coding-ruby.mdx";
import StartCodingPython from "/snippets/coding/_start-coding-python.mdx";
import StartCodingNode from "/snippets/coding/_start-coding-node.mdx";
import StartCodingGoLang from "/snippets/coding/_start-coding-golang.mdx";
import StartCodingJava from "/snippets/coding/_start-coding-java.mdx";
import StartCodingRuby from '/snippets/integrations/code/_start-coding-ruby.mdx';
import StartCodingPython from '/snippets/integrations/code/_start-coding-python.mdx';
import StartCodingNode from '/snippets/integrations/code/_start-coding-node.mdx';
import StartCodingGoLang from '/snippets/integrations/code/_start-coding-golang.mdx';
import StartCodingJava from '/snippets/integrations/code/_start-coding-java.mdx';

Easily integrate your app with Tiger Cloud or self-hosted TimescaleDB. Use your favorite programming language to connect to your
service, create and manage hypertables, then ingest and query data.

Easily integrate your app with {CLOUD_LONG} or {SELF_LONG}. Use your favorite programming language to connect to your
{SERVICE_LONG}, create and manage hypertables, then ingest and query data.
<Tabs>

<Tabs label="Start coding with Tiger Data" >

<Tab title="Ruby" >
<Tab title="Ruby">

<StartCodingRuby />

</Tab>
<Tab title="Python" >

<Tab title="Python">

<StartCodingPython />

</Tab>
<Tab title="Node.js" >

<Tab title="Node.js">

<StartCodingNode />

</Tab>
<Tab title="Go" >

<Tab title="Go">

<StartCodingGoLang />

</Tab>
<Tab title="Java" >

<Tab title="Java">

<StartCodingJava />

</Tab>

</Tabs>



You are not limited to these languages. {CLOUD_LONG} is based on {PG}, you can interface
with {TIMESCALE_DB} and {CLOUD_LONG} using any [{PG} client driver][postgres-drivers].


[postgres-drivers]: https://wiki.postgresql.org/wiki/List_of_drivers
You are not limited to these languages. Tiger Cloud is based on {PG}, you can interface
with TimescaleDB and Tiger Cloud using any [{PG} client driver](https://wiki.postgresql.org/wiki/List_of_drivers).