Merged
6 changes: 6 additions & 0 deletions Makefile
@@ -58,6 +58,11 @@ api-client: download
	rm -f schemas/gooddata-api-client.json
	cat schemas/gooddata-*.json | jq -S -s 'reduce .[] as $$item ({}; . * $$item) + { tags : ( reduce .[].tags as $$item (null; . + $$item) | unique_by(.name) ) }' | sed '/\u0000/d' > "schemas/gooddata-api-client.json"
	$(call generate_client,api)
	# OpenAPI Generator drops the \x00 literal from regex patterns like ^[^\x00]*$,
	# producing the invalid Python regex ^[^]*$. Restore the null-byte escape.
	find gooddata-api-client/gooddata_api_client -name '*.py' -exec \
		sed -i.bak 's/\^\[\^\]\*\$$/^[^\\x00]*$$/g' {} + && \
		find gooddata-api-client/gooddata_api_client -name '*.py.bak' -delete
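The need for this post-processing step can be checked directly in Python: the mangled pattern fails to compile at all (the `]` right after `[^` is read as a literal class member, so the class is never closed), while the restored pattern rejects strings containing a null byte. A standalone illustration, not part of the generated client:

```python
import re

# The generator emits "^[^]*$": inside a class, a "]" immediately after
# "[^" is a literal member, so the class never closes and compilation fails.
try:
    re.compile("^[^]*$")
    broken_compiles = True
except re.error:
    broken_compiles = False

# After the sed fix the null-byte escape is back and the pattern compiles,
# matching any string that contains no null byte.
fixed = re.compile(r"^[^\x00]*$")
```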

.PHONY: download
download:
@@ -66,6 +71,7 @@ download:
	$(call download_client,scan)
	$(call download_client,"export")
	$(call download_client,automation)
	$(call download_client,result)

.PHONY: type-check
type-check:
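The jq pipeline in the `api-client` target above deep-merges the per-service OpenAPI schemas with jq's `*` operator before generating the combined client. A rough Python sketch of that merge semantics, using made-up schema fragments:

```python
def deep_merge(a: dict, b: dict) -> dict:
    """Recursive merge mirroring jq's `*` operator on objects:
    nested dicts are merged key by key, any other value in b wins."""
    out = dict(a)
    for key, value in b.items():
        if isinstance(out.get(key), dict) and isinstance(value, dict):
            out[key] = deep_merge(out[key], value)
        else:
            out[key] = value
    return out

# Hypothetical per-service schema fragments
a = {"info": {"title": "scan"}, "paths": {"/scan": {}}}
b = {"info": {"title": "export"}, "paths": {"/export": {}}}

merged = deep_merge(a, b)
# Nested "paths" keys are combined; the scalar "title" from b wins.
```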
8 changes: 8 additions & 0 deletions docs/content/en/latest/data/data-source/_index.md
@@ -37,6 +37,14 @@ See [Connect Data](https://www.gooddata.com/docs/cloud/connect-data/) to learn h
* [scan_schemata](./scan_schemata/)
* [scan_sql](./scan_sql/)

### CSV Upload Methods

* [staging_upload](./staging_upload/)
* [analyze_csv](./analyze_csv/)
* [import_csv](./import_csv/)
* [delete_csv_files](./delete_csv_files/)
* [upload_csv](./upload_csv/)


## Example

37 changes: 37 additions & 0 deletions docs/content/en/latest/data/data-source/analyze_csv.md
@@ -0,0 +1,37 @@
---
title: "analyze_csv"
linkTitle: "analyze_csv"
weight: 191
superheading: "catalog_data_source."
---



``analyze_csv(location: str)``

Analyzes an uploaded CSV file in the staging area. Returns column metadata, detected types, preview data, and a config object that can be passed to import_csv.

{{% parameters-block title="Parameters"%}}

{{< parameter p_name="location" p_type="string" >}}
Location string returned by staging_upload.
{{< /parameter >}}

{{% /parameters-block %}}

{{% parameters-block title="Returns"%}}

{{< parameter p_type="AnalyzeCsvResponse" >}}
Analysis result with columns, preview data, and config.
{{< /parameter >}}

{{% /parameters-block %}}

## Example

```python
# Analyze a previously uploaded CSV file
analysis = sdk.catalog_data_source.analyze_csv(location="staging/some-location")
for col in analysis["columns"]:
    print(f"{col['name']}: {col['type']}")
```
37 changes: 37 additions & 0 deletions docs/content/en/latest/data/data-source/delete_csv_files.md
@@ -0,0 +1,37 @@
---
title: "delete_csv_files"
linkTitle: "delete_csv_files"
weight: 193
superheading: "catalog_data_source."
---



``delete_csv_files(data_source_id: str, file_names: list[str])``

Deletes files from a GDSTORAGE data source.

{{% parameters-block title="Parameters"%}}

{{< parameter p_name="data_source_id" p_type="string" >}}
Data source identification string.
{{< /parameter >}}

{{< parameter p_name="file_names" p_type="list[string]" >}}
List of file names to delete.
{{< /parameter >}}

{{% /parameters-block %}}

{{% parameters-block title="Returns" None="yes"%}}
{{% /parameters-block %}}

## Example

```python
# Delete specific files from a GDSTORAGE data source
sdk.catalog_data_source.delete_csv_files(
    data_source_id="my-gdstorage-ds",
    file_names=["my_table.csv"],
)
```
49 changes: 49 additions & 0 deletions docs/content/en/latest/data/data-source/import_csv.md
@@ -0,0 +1,49 @@
---
title: "import_csv"
linkTitle: "import_csv"
weight: 192
superheading: "catalog_data_source."
---



``import_csv(data_source_id: str, table_name: str, location: str, config: Optional[dict] = None)``

Imports a CSV file from the staging area into a GDSTORAGE data source.

{{% parameters-block title="Parameters"%}}

{{< parameter p_name="data_source_id" p_type="string" >}}
Data source identification string.
{{< /parameter >}}

{{< parameter p_name="table_name" p_type="string" >}}
Name for the table to create or replace.
{{< /parameter >}}

{{< parameter p_name="location" p_type="string" >}}
Location string returned by staging_upload.
{{< /parameter >}}

{{< parameter p_name="config" p_type="Optional[dict]" >}}
Source configuration dict, typically taken from the analyze_csv response. Optional.
{{< /parameter >}}

{{% /parameters-block %}}

{{% parameters-block title="Returns" None="yes"%}}
{{% /parameters-block %}}

## Example

```python
# Import a CSV into a GDSTORAGE data source using config from analysis
analysis = sdk.catalog_data_source.analyze_csv(location=location)
config = analysis.to_dict().get("config")
sdk.catalog_data_source.import_csv(
    data_source_id="my-gdstorage-ds",
    table_name="my_table",
    location=location,
    config=config,
)
```
37 changes: 37 additions & 0 deletions docs/content/en/latest/data/data-source/staging_upload.md
@@ -0,0 +1,37 @@
---
title: "staging_upload"
linkTitle: "staging_upload"
weight: 190
superheading: "catalog_data_source."
---



``staging_upload(csv_file: Path)``

Uploads a CSV file to the staging area and returns a location string that can be used in subsequent calls to analyze_csv and import_csv.

{{% parameters-block title="Parameters"%}}

{{< parameter p_name="csv_file" p_type="Path" >}}
Path to the CSV file to upload.
{{< /parameter >}}

{{% /parameters-block %}}

{{% parameters-block title="Returns"%}}

{{< parameter p_type="string" >}}
Location string referencing the uploaded file in staging.
{{< /parameter >}}

{{% /parameters-block %}}

## Example

```python
from pathlib import Path

# Upload a CSV file to staging
location = sdk.catalog_data_source.staging_upload(csv_file=Path("data.csv"))
```
44 changes: 44 additions & 0 deletions docs/content/en/latest/data/data-source/upload_csv.md
@@ -0,0 +1,44 @@
---
title: "upload_csv"
linkTitle: "upload_csv"
weight: 194
superheading: "catalog_data_source."
---



``upload_csv(data_source_id: str, csv_file: Path, table_name: str)``

Convenience method that uploads a CSV file and imports it into a GDSTORAGE data source in a single call. Orchestrates the full flow: staging_upload → analyze_csv → import_csv → register_upload_notification.

{{% parameters-block title="Parameters"%}}

{{< parameter p_name="data_source_id" p_type="string" >}}
Data source identification string for a GDSTORAGE data source.
{{< /parameter >}}

{{< parameter p_name="csv_file" p_type="Path" >}}
Path to the CSV file to upload.
{{< /parameter >}}

{{< parameter p_name="table_name" p_type="string" >}}
Name for the table to create or replace in the data source.
{{< /parameter >}}

{{% /parameters-block %}}

{{% parameters-block title="Returns" None="yes"%}}
{{% /parameters-block %}}

## Example

```python
from pathlib import Path

# Upload a CSV file end-to-end in a single call
sdk.catalog_data_source.upload_csv(
    data_source_id="my-gdstorage-ds",
    csv_file=Path("data.csv"),
    table_name="my_table",
)
```
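The single call above is roughly equivalent to chaining the individual methods by hand, as in this sketch based on the flow listed in the description (`register_upload_notification` is invoked internally by upload_csv and omitted here, since its signature is not documented on these pages):

```python
from pathlib import Path

# 1. Upload the file to staging and get back a location string
location = sdk.catalog_data_source.staging_upload(csv_file=Path("data.csv"))

# 2. Analyze the upload to derive a source config
analysis = sdk.catalog_data_source.analyze_csv(location=location)
config = analysis.to_dict().get("config")

# 3. Import the staged file into the GDSTORAGE data source
sdk.catalog_data_source.import_csv(
    data_source_id="my-gdstorage-ds",
    table_name="my_table",
    location=location,
    config=config,
)
```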
31 changes: 31 additions & 0 deletions gooddata-api-client/.github/workflows/python.yml
@@ -0,0 +1,31 @@
# NOTE: This file is auto generated by OpenAPI Generator.
# URL: https://openapi-generator.tech
#
# ref: https://docs.github.com/en/actions/automating-builds-and-tests/building-and-testing-python

name: gooddata_api_client Python package

on: [push, pull_request]

jobs:
  build:

    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.8", "3.9", "3.10", "3.11", "3.12"]

    steps:
      - uses: actions/checkout@v4
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
          pip install -r test-requirements.txt
      - name: Test with pytest
        run: |
          pytest --cov=gooddata_api_client