**README.md** (13 additions, 5 deletions)
# About

Pontoon is an open source, self-hosted data export platform. We built Pontoon from the ground up for the use case of shipping data products to your enterprise customers. APIs are painful to work with at modern data scale, and Pontoon is built to make it easy to sync data directly to your customer's data warehouse. Your customers get their data without needing to build ETLs or pay for ETL tools, empowering them to make data-driven decisions for their business, with data from your product. Welcome to the future of customer data syncs 🚀

> Want to get Pontoon up and running in minutes? Try our [Quick Start Guide](https://pontoon-data.github.io/Pontoon/getting-started/quick-start/).
- **🚀 Easy Deployment**: [Get started](https://pontoon-data.github.io/Pontoon/getting-started/quick-start/) in minutes with Docker or deploy to [AWS Fargate](https://pontoon-data.github.io/Pontoon/getting-started/aws-fargate/)
- **🎯 Major Data Warehouse Integrations**: Supports [Snowflake](https://www.snowflake.com/), [Google BigQuery](https://cloud.google.com/bigquery), [Amazon Redshift](https://aws.amazon.com/redshift/), and [Postgres](https://www.postgresql.org/) as sources and destinations
- **☁️ Multi-cloud**: Send data from any cloud to any cloud. Amazon Redshift ➡️ Google BigQuery? No problem!
- **⚡ Automated Syncs**: Schedule data transfers with automatic backfills. Incremental loads automatically keep destination datasets in sync
- **🏗️ Built for Scale**: Sync over 1 million records per minute
- **✨ Web Interface**: User-friendly dashboard for managing syncs, built with React/Next.js
- **🔌 REST API**: Programmatic access to all Pontoon features, built with FastAPI
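The "Automated Syncs" feature above combines a one-time backfill with watermark-style incremental loads. A minimal sketch of that idea, assuming a simple `updated_at` high-water mark; this is an illustrative model with invented names, not Pontoon's actual implementation:

```python
from datetime import datetime

def select_incremental(rows, last_synced_at):
    """Return the rows changed since the last sync, plus the new watermark.

    `rows` are dicts carrying an `updated_at` timestamp; `last_synced_at` is
    the high-water mark from the previous run (None forces a full backfill).
    """
    if last_synced_at is None:
        batch = list(rows)  # first run: backfill everything
    else:
        batch = [r for r in rows if r["updated_at"] > last_synced_at]
    watermark = max((r["updated_at"] for r in batch), default=last_synced_at)
    return batch, watermark

rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},
    {"id": 2, "updated_at": datetime(2024, 1, 5)},
]
full, wm = select_incremental(rows, None)                 # backfill: both rows
inc, _ = select_incremental(rows, datetime(2024, 1, 2))   # incremental: row 2 only
```

Only rows newer than the watermark move on later runs, which is what keeps destination datasets in sync without re-copying everything.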
Pontoon solves these problems with:

- **Direct Warehouse Integration**: Send data directly to your customer's data warehouse. No more ETLs needed!
- **Scalable Architecture**: Handle millions of rows efficiently. Say goodbye to rate limits!
- **Automated Scheduled Syncs**: Automatically keep destinations up to date, syncing only data that's new or changed. Backfills are completed on the first sync.
- **Self-Hosted**: Full control over your data and infrastructure
> Note: If you're using CMD or PowerShell, run the command on one line, without `\`

To view the Web UI: `localhost:3000`. To view the OpenAPI docs / test the API: `localhost:8000/docs`.

Check out the [transfer quick start guide](https://pontoon-data.github.io/Pontoon/getting-started/transfer-quick-start/) to add your first source and destination.
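Because every feature is also exposed over the REST API, syncs can be managed programmatically. Here is a hedged sketch using only the Python standard library; the `/sources` path and the payload fields are assumptions for illustration, so consult the OpenAPI docs at `localhost:8000/docs` for the real routes and schemas:

```python
import json
import urllib.request

# Hypothetical endpoint and payload -- check localhost:8000/docs for the
# actual routes and request bodies exposed by the FastAPI service.
payload = {"name": "analytics_db", "type": "postgres"}
req = urllib.request.Request(
    "http://localhost:8000/sources",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would send the request once Pontoon is running.
```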
## Running Pontoon with Docker Compose

To build Pontoon from source, use Docker Compose.

```sh
docker compose up --build
```

## Creating Your First Data Export

To quickly set up your first transfer in Pontoon, follow the steps in the [transfer quick start](https://pontoon-data.github.io/Pontoon/getting-started/transfer-quick-start/) guide!
**docs/docs/getting-started/transfer-quick-start.md** (8 additions, 4 deletions)
Here's how to quickly set up your first transfer in Pontoon. Follow these steps.

A source defines a connection to a database that holds data you want to sync. For detailed configuration instructions for each source type, see our [Sources & Destinations documentation](../sources-destinations/overview.md).

To add a source:

1. Navigate to the Sources page in the Pontoon web interface
4. Configure the connection details for your data source
5. Test the connection to ensure it's working properly


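Configuring and testing the connection details usually comes down to supplying a complete set of host, port, database, and credential fields. A small illustrative check of that shape; the field names here are assumptions for the sketch, not taken from Pontoon's UI:

```python
REQUIRED_FIELDS = {"host", "port", "database", "user", "password"}

def missing_fields(config: dict) -> set:
    """Return which required connection fields are absent from a config."""
    return REQUIRED_FIELDS - config.keys()

cfg = {"host": "db.internal", "port": 5432, "database": "app", "user": "pontoon"}
gap = missing_fields(cfg)  # the password is still missing
```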
## Step 2: Adding a Model

A model defines a dataset that is ready for export to recipients. Models are multi-tenant, with the `tenant_id` defining which rows belong to which customer.

When creating a model, you'll need to configure three critical fields:

- **Tenant ID**: An identifier used to associate data with specific customers. This same tenant ID will be used when adding recipients, creating the link between your data and who receives it.


## Step 3: Adding a Recipient

Recipients are the customers or organizations that will receive your data. Each recipient is associated with a specific tenant ID.

To add a recipient:

3. Enter the recipient's details including their unique tenant ID

A destination defines where your data will be sent. This could be a data warehouse or another database.

For detailed configuration instructions for each destination type, see our [Sources & Destinations documentation](../sources-destinations/overview.md).

To add a destination:

5. Configure the connection details
6. Test the connection to ensure it's working properly

Once you have added a destination, it will kick off an initial sync to backfill data. Click on a destination and navigate to the Transfers tab to see the status of the transfer.
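The initial-sync behavior can be summarized: the first transfer for a destination backfills the full dataset, and subsequent scheduled transfers move only new or changed rows. A hypothetical sketch of that decision, with the function name invented for illustration:

```python
def transfer_mode(previous_successful_syncs: int) -> str:
    """First transfer backfills everything; later ones load incrementally."""
    return "backfill" if previous_successful_syncs == 0 else "incremental"

first = transfer_mode(0)  # a brand-new destination gets a full backfill
later = transfer_mode(3)  # an established destination syncs incrementally
```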