Introduction
Extracting insights from data across multiple platforms—whether CSV files, PostgreSQL, or cloud-based solutions like BigQuery—can be a complex and time-consuming task. Traditionally, users must manually configure databases, define schemas, and write SQL queries just to generate reports. However, Terno AI revolutionizes this process by enabling instant analytics, allowing users to connect, query, and analyze their data effortlessly.
With Terno AI, you can instantly upload CSV files, where the system automatically detects column names, assigns appropriate data types, and structures them into a query-ready format. Its seamless BigQuery integration lets users access real-time insights using natural language instead of complex SQL commands. Additionally, for PostgreSQL users, Terno AI offers a quick and intuitive way to connect to NeonDB, enabling instant analytics without manual query writing. Migrating a local PostgreSQL database to NeonDB is also simplified with a guided, step-by-step approach, ensuring a smooth transition to a scalable, cloud-native environment.
Whether you’re working with structured CSV files, leveraging BigQuery for large-scale data analysis, or optimizing your PostgreSQL database for better performance, Terno AI eliminates technical barriers and delivers instant analytics with ease.
How to Upload CSV Files in Terno to Get Instant Analytics
Introduction
Uploading and organizing data can be a hassle, especially when dealing with CSV files. Normally, you would need to create a database table manually before adding data. But what if this process could be automated?
With Terno AI, you can upload a CSV file, and it will automatically generate a table in a database without any manual work. Even if your file doesn’t have column headers, our smart system will figure them out for you!
Here’s How It Works
Whenever you upload a CSV file (e.g., file1.csv, file2.csv), the system follows these steps (a rough code sketch follows the list):
- Derives a Meaningful Table Name: Rather than just reusing the file name, the system looks at the data inside and gives the table a meaningful name.
- Reads the File Content: It checks if the first row contains column names.
- Creates a Table Schema:
- If column names are present, they are used.
- If not, the system analyzes the data and suggests relevant column names.
- Adds the Data to the Table: Once the table is created, the data from the file is inserted automatically.
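To make these steps concrete, here is a minimal sketch of such a pipeline in Python, using pandas and SQLAlchemy. Terno AI’s internals are not public, so this is an illustration under simplifying assumptions, not the actual implementation; the header heuristic and the generic column names stand in for Terno’s smarter analysis.

import pandas as pd
from sqlalchemy import create_engine

def load_csv_to_table(path, engine, table_name):
    # Peek at the first rows to decide whether row 0 is a header:
    # if any value in the first row parses as a number, it is more
    # likely data than a list of column names.
    sample = pd.read_csv(path, header=None, nrows=5)
    first_row_has_numbers = pd.to_numeric(sample.iloc[0], errors="coerce").notna().any()
    header = None if first_row_has_numbers else 0

    df = pd.read_csv(path, header=header)
    if header is None:
        # No header row: fall back to generic names; a smarter system
        # would inspect the values to suggest names like "id" or "price".
        df.columns = [f"column_{i}" for i in range(len(df.columns))]

    # to_sql() maps each pandas dtype to a matching SQL type
    # (int64 -> INTEGER, object -> VARCHAR/TEXT, ...) and inserts the rows.
    df.to_sql(table_name, engine, if_exists="fail", index=False)

engine = create_engine("sqlite:///analytics.db")  # any SQLAlchemy URL works
load_csv_to_table("users.csv", engine, "users")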
How the System Names Tables and Columns
Instead of using generic names, our system assigns meaningful names to tables and columns. For example:
- file1.csv → Becomes → customers
- file2.csv → Becomes → sales_data
- file3.csv → Becomes → product_inventory
Similarly, if your CSV file doesn’t have headers, the system will suggest column names like “id,” “name,” or “price” based on the content. You can also rename these through the Admin page if needed.
Expanding Your Database
Need to add more data? Just upload another CSV file! The system will analyze it, generate a proper table name, extract column details, and add it to your database automatically.
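In terms of the sketch above, expanding the database is just another call to the same (hypothetical) pipeline:

load_csv_to_table("file2.csv", engine, "sales_data")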
SQL Schema Creation Example
Example 1: CSV File with Headers
Imagine you upload a file called users.csv, which looks like this:
id, name, age
1, John, 30
2, Alice, 25
3, Bob, 35
The system recognizes the first row as column names (id, name, age) and creates a database table like this:
CREATE TABLE users (
id INT,
name VARCHAR(255),
age INT
);
Then, it fills the table with the data from the file.
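Conceptually, this load step is equivalent to an INSERT statement like the one below; the actual mechanism is internal to Terno, so this is purely illustrative:

INSERT INTO users (id, name, age) VALUES
  (1, 'John', 30),
  (2, 'Alice', 25),
  (3, 'Bob', 35);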
Example 2: CSV File Without Headers
If your file doesn’t have column names and looks like this:
1, John, 30
2, Alice, 25
3, Bob, 35
The system analyzes the content and assigns meaningful column names based on the values:
CREATE TABLE users (
id INT,
name VARCHAR(255),
age INT
);
It ensures that data types are correctly assigned (e.g., numbers as INT and text as VARCHAR).
This way, even if your CSV file lacks headers, the system intelligently structures the table for you.
Conclusion
Terno AI makes it easy to upload and organize CSV files without any technical knowledge. It analyzes your data, creates a well-structured table, and inserts the data, all automatically. If needed, you can also rename columns later through the Admin panel.
No more manually setting up tables: just upload your CSV file and let Terno do the work!
How to Connect Your BigQuery Dataset to Terno to Get Instant Analytics
Introduction
Harnessing data analytics shouldn’t mean navigating technical hurdles that slow you down. Imagine being able to query your data as effortlessly as asking a simple question. With Terno AI’s integration with Google BigQuery, this vision becomes a reality. This easy-to-follow guide will walk you through connecting your BigQuery datasets to Terno AI, enabling you to quickly access powerful insights through intuitive, natural language queries. Whether you’re an analyst, developer, or simply passionate about data, you’ll find this process straightforward and transformative for your workflow.
Connect BigQuery to Terno
Step 1: Enable BigQuery API
Before connecting your dataset, you need to enable the BigQuery API in your Google Cloud project.
- Go to the Google Cloud Console.
- Select your project or create a new one.
- Navigate to APIs & Services → Library.
- Search and enable the BigQuery API.
By enabling this API, your project gains access to BigQuery’s powerful data analytics capabilities.
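If you prefer the command line, the same step can be done with the gcloud CLI, assuming it is installed and authenticated against the right project:

gcloud services enable bigquery.googleapis.com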
Step 2: Create a Service Account & Generate Credentials JSON
To allow Terno AI to access your BigQuery dataset, you need to create a Service Account with the appropriate permissions.
- Go to Google Cloud IAM & Admin.
- Select your project.
- Create a Service Account.
- Assign the following role:
- BigQuery Data Viewer (roles/bigquery.dataViewer) – This ensures the service account has read-only access to your dataset.
- In the Service Accounts list, find your newly created service account.
- Then open Actions (⋮), select Manage Keys, and create a new JSON key. Keep this file safe and private!
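Equivalently, these steps can be scripted with the gcloud CLI. The service account name terno-reader and the key file name key.json are arbitrary examples; replace YOUR_PROJECT_ID with your own project ID:

gcloud iam service-accounts create terno-reader --display-name="Terno read-only access"
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
  --member="serviceAccount:terno-reader@YOUR_PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/bigquery.dataViewer"
gcloud iam service-accounts keys create key.json \
  --iam-account=terno-reader@YOUR_PROJECT_ID.iam.gserviceaccount.com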
Step 3: Generate Your BigQuery Connection String
To connect BigQuery to Terno, you need a connection string.
The format of a BigQuery connection string is:
bigquery://<project_id>/<dataset_id>
Where to Find Your Project ID & Dataset ID?
- Project ID:
- Go to Google Cloud Console.
- Click on the project selector (top-left corner) and find your Project ID.
- Dataset ID:
- Go to BigQuery Console.
- In the Explorer Panel, expand your project.
- Locate the dataset you want to connect.
- The dataset name is your Dataset ID.
- Additional Step:
Paste the contents of the downloaded JSON key file into the BigQuery connection configuration box when adding the BigQuery data source in Terno.
Example Connection String
If:
- Project ID = careful-ensign-453007-e0
- Dataset ID = test_sample_data
Then, your BigQuery connection string is:
bigquery://careful-ensign-453007-e0/test_sample_data
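This bigquery:// format matches the SQLAlchemy BigQuery dialect, so you can sanity-check the string and the JSON key before adding them to Terno. This sketch assumes the sqlalchemy-bigquery package is installed and that the key from Step 2 is saved as key.json:

from sqlalchemy import create_engine, text

# credentials_path points at the JSON key downloaded in Step 2.
engine = create_engine(
    "bigquery://careful-ensign-453007-e0/test_sample_data",
    credentials_path="key.json",
)
with engine.connect() as conn:
    # A trivial query confirms the connection string and key both work.
    print(conn.execute(text("SELECT 1")).scalar())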
Conclusion
Connecting your BigQuery dataset to Terno AI is a straightforward yet powerful step towards unlocking faster, more intuitive data exploration. By enabling secure, read-only access through a service account and clearly defining your BigQuery connection string, you can confidently query your data using simple, natural language. Embrace the ease and efficiency of Terno AI integrated with BigQuery, and transform your data analytics experience today.
How to Connect NeonDB to Terno AI to Get Instant Analytics
Introduction
Managing databases shouldn’t involve complicated SQL queries. NeonDB, a powerful cloud-native PostgreSQL database, integrates effortlessly with Terno AI’s Text-to-Query, letting you query data using natural language. Let’s quickly set it up!
Step 1: Retrieve NeonDB Connection String
- Log into your NeonDB account.
- Choose your project and copy your PostgreSQL connection string (it looks like this):
postgres://<username>:<password>@<host>:<port>/<database>?sslmode=require
Ensure the connection string includes sslmode=require so SSL is enabled.
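Before adding the string to Terno, you can optionally verify it with the psql client, substituting your actual credentials:

psql "postgres://<username>:<password>@<host>:<port>/<database>?sslmode=require" -c "SELECT version();"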
Step 2: Add NeonDB to Terno AI
- Go to Terno AI’s Admin Portal.
- Click Add Data Source.
- Choose Postgres as the data source type.
- Paste your NeonDB connection string.
- Enter postgres as the dialect name.
- Click Save and test your connection.
Conclusion
Linking NeonDB with Terno AI simplifies database interactions, enabling powerful natural-language queries without writing SQL. Boost your productivity and experience seamless data management now!
How to Connect Your Local PostgreSQL Database to Terno to Get Instant Analytics
Introduction
In the era of AI-driven applications, having a scalable, cloud-based database is essential for efficient text-to-query processing. If you’re currently using a local PostgreSQL database and want to connect it to Terno, you will first need to migrate it to NeonDB, a serverless PostgreSQL solution. This guide walks you through the process step by step.
By the end of this guide, you will have your database fully migrated and integrated into your Terno system, allowing seamless query execution in a cloud environment.
Step 1: Backup Your Local PostgreSQL Database
Before migrating to NeonDB, create a backup of your local PostgreSQL database using pg_dump. This ensures all your data is safely stored and ready for transfer.
Run the following command in your terminal:
pg_dump -U your_local_user -d your_local_db -h localhost -p 5432 -F c -f backup.dump
Explanation:
- -U your_local_user → Your PostgreSQL username
- -d your_local_db → The name of your local database
- -h localhost -p 5432 → The host and port of your local PostgreSQL server
- -F c → Use the custom archive format, which pg_restore expects
- -f backup.dump → Write the backup to this file
This will create a backup file (backup.dump) containing your database schema and data.
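As an optional sanity check, you can list the contents of the archive before restoring it:

pg_restore --list backup.dump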
Step 2: Restore Backup to NeonDB
Now that you have a backup of your local PostgreSQL database, it’s time to restore it into NeonDB.
Run the following command to restore your database:
pg_restore -U neon_user -d neon_db_name -h neon_host -p 5432 -F c --clean --no-owner backup.dump
If this is your connection string:
postgresql://neondb_owner:[YOUR PASSWORD]@ep-broad-feather-ab44q915-pooler.eu-west-2.aws.neon.tech/neondb?sslmode=require
You have to run this command:
pg_restore -U neondb_owner \
-d neondb \
-h ep-broad-feather-ab44q915-pooler.eu-west-2.aws.neon.tech \
-p 5432 \
--no-owner --clean backup.dump
After running this command, you will be prompted to enter your NeonDB password.
Explanation:
- -U neondb_owner → Your NeonDB username
- -d neondb → The target NeonDB database name
- -h ep-broad-feather-ab44q915-pooler.eu-west-2.aws.neon.tech → The hostname of your NeonDB instance
- -p 5432 → The port (PostgreSQL’s default)
- --clean → Drop any existing objects in the target database before recreating them
- --no-owner → Skip restoring object ownership, since your local role names don’t exist on NeonDB
After executing this command, your database will be successfully transferred to NeonDB.
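To confirm the migration, you can list the restored tables directly on NeonDB using your connection string:

psql "postgresql://neondb_owner:[YOUR PASSWORD]@ep-broad-feather-ab44q915-pooler.eu-west-2.aws.neon.tech/neondb?sslmode=require" -c "\dt"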
Step 3: Connect NeonDB to Terno
Follow the steps in the How to Connect NeonDB to Terno AI section above to add your migrated database as a data source.
Conclusion
Terno AI revolutionizes data handling by automating CSV file processing, enabling natural language queries for BigQuery, simplifying NeonDB integration, and providing a hassle-free migration path from local PostgreSQL databases to the cloud. Instead of dealing with manual table creation, complex SQL queries, and schema definitions, users can focus on gaining insights from their data.
By offering AI-powered automated table creation, smart schema detection, and seamless cloud database connectivity, Terno AI eliminates technical barriers and enhances productivity. Whether you’re an analyst, developer, or business professional, you can now interact with your data effortlessly, without deep technical expertise.
With Terno AI, you no longer need to worry about database complexities: just upload, connect, and query. Embrace the future of AI-powered data management and unlock new possibilities with Terno AI today.