6 changes: 3 additions & 3 deletions RUNNING-DOCS-LOCALLY.md
@@ -14,9 +14,9 @@ To run these docs locally you'll need:

If you want to fully render all documentation locally you need to install the following plugins with `pip install`:

-* [glightbox](https://pypi.org/project/mkdocs-glightbox/0.1.0/)
-* [multirepo](https://pypi.org/project/mkdocs-multirepo/)
-* [redirects](https://pypi.org/project/mkdocs-redirects/)
+* [mkdocs-glightbox](https://pypi.org/project/mkdocs-glightbox/0.1.0/)
+* [mkdocs-multirepo](https://pypi.org/project/mkdocs-multirepo/)
+* [mkdocs-redirects](https://pypi.org/project/mkdocs-redirects/)

You also need to sign up to the [Insiders Programme](https://squidfunk.github.io/mkdocs-material/insiders/).

28 changes: 8 additions & 20 deletions docs/apis/data-catalogue-api/intro.md
@@ -1,37 +1,25 @@
# Introduction

-The Data Catalogue HTTP API allows you to fetch data stored in the Quix
-platform. You can use it for exploring the platform, prototyping
-applications, or working with stored data in any language with HTTP
-capabilities.
+The Data Catalogue HTTP API allows you to fetch data stored in the Quix platform. You can use it for exploring the platform, prototyping applications, or working with stored data in any language with HTTP capabilities.

-The API is fully described in our [Swagger
-documentation](get-swagger.md). Read on for
-a guide to using the API, including real-world examples you can execute
-from your language of choice, or via the command line using `curl`.
+The API is fully described in our [Swagger documentation](get-swagger.md). Read on for a guide to using the API, including real-world examples you can invoke from your language of choice, or from the command line using `curl`.

## Preparation

-Before using any of the endpoints, you’ll need to know how to
-[authenticate your requests](authenticate.md) and
-how to [form a typical request to the
-API](request.md).
+Before using any of the endpoints, you’ll need to know how to [authenticate your requests](authenticate.md) and how to [form a typical request to the API](request.md).

-You’ll also need to have some data stored in the Quix platform for API
-use to be meaningful. You can use any Source from our [Code Samples](../../platform/samples/samples.md) to do this using the Quix
-portal.
+You’ll also need to have some data stored in the Quix platform for API use to be meaningful. You can use any Source from our [Code Samples](../../platform/samples/samples.md) to do this using the Quix portal.

## Further documentation

-| | | |
-| ------------------------------------------------------------------ | ------------------ | ----------------------------------------- |
-| Documentation | Endpoint | Examples |
+| Documentation | Endpoint | Examples |
+| -------------------------------------------- | ------------------ | ----------------------------------------- |
| [Streams, paged](streams-paged.md) | `/streams` | Get all streams in groups of ten per page |
| [Streams, filtered](streams-filtered.md) | `/streams` | Get a single stream, by ID |
-| | | Get only the streams with LapNumber data |
+| | | Get only the streams with LapNumber data |
| [Streams & models](streams-models.md) | `/streams/models` | Get stream hierarchy |
| [Raw data](raw-data.md) | `/parameters/data` | Get all the `Speed` readings |
-| | | Get `Speed` data between timestamps |
+| | | Get `Speed` data between timestamps |
| [Aggregated data by time](aggregate-time.md) | `/parameters/data` | Downsample or upsample data |
| [Aggregated by tags](aggregate-tags.md) | `/parameters/data` | Show average Speed by LapNumber |
| [Tag filtering](filter-tags.md) | `/parameters/data` | Get data for just one Lap |
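
As a quick illustration of the endpoints above, here is a minimal sketch of calling `/streams` with Python's `requests`. The base URL and token are placeholder assumptions; the authentication and request pages describe the exact form for your workspace.

```python
import requests

# Placeholders: substitute your workspace's query endpoint and a valid
# token (see "authenticate your requests" and "form a typical request").
BASE_URL = "https://telemetry-query-myworkspace.platform.quix.ai"
TOKEN = "YOUR_PAT_TOKEN"

# An empty filter body; add paging or filter fields as documented.
response = requests.post(
    f"{BASE_URL}/streams",
    headers={"Authorization": f"bearer {TOKEN}"},
    json={},
)
response.raise_for_status()
print(response.json())  # the stream metadata returned by the API
```
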
13 changes: 4 additions & 9 deletions docs/apis/streaming-writer-api/intro.md
@@ -1,14 +1,9 @@
# Introduction

-The Streaming Writer API allows you to stream data into the Quix
-platform via HTTP endpoints or SignalR. It’s an alternative to using our
-C\# and Python client libraries. You can use the Streaming Writer API from any
-HTTP-capable language.
-
-The API is fully documented in our [Swagger
-documentation](get-swagger.md). Read on for a
-guide to using the API, including real-world examples you can execute
-from your language of choice, or via the command line using curl.
+The Streaming Writer API allows you to stream data into the Quix platform via HTTP endpoints or SignalR. It’s an alternative to using our C# and Python client libraries. You can use the Streaming Writer API from any HTTP-capable language.
+
+The API is fully documented in our [Swagger documentation](get-swagger.md). Read on for a guide to using the API, including real-world examples you can invoke
+from your language of choice, or from the command line using `curl`.
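
As a sketch of what such a request can look like — the endpoint path, payload shape, base URL and token below are illustrative assumptions, with the Swagger documentation as the authority — here is one timestamped parameter value sent with Python's `requests`:

```python
import time

import requests

# Placeholders: workspace-specific writer endpoint, topic, stream and token.
BASE_URL = "https://writer-myworkspace.platform.quix.ai"
TOKEN = "YOUR_PAT_TOKEN"

# Illustrative payload: one timestamped numeric value for a "Speed" parameter.
payload = {
    "timestamps": [time.time_ns()],
    "numericValues": {"Speed": [142.7]},
}

response = requests.post(
    f"{BASE_URL}/topics/my-topic/streams/my-stream/parameters/data",
    headers={"Authorization": f"bearer {TOKEN}"},
    json=payload,
)
response.raise_for_status()
```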

## Preparation

2 changes: 1 addition & 1 deletion docs/index.md
@@ -117,7 +117,7 @@ Read more about the Quix Streams Client Library and APIs.

---

-Query historic time-series data in Quix using HTTP interface.
+Query historical time-series data in Quix using the HTTP interface.

[:octicons-arrow-right-24: Learn more](./apis/data-catalogue-api/intro.md)

6 changes: 3 additions & 3 deletions docs/platform/MLOps.md
@@ -19,18 +19,18 @@ a seamless journey from concept to production. The key steps are:
Any member of any team can quickly access data in the Catalogue without
support from software or regulatory teams.

-## Develop features in historic data
+## Develop features in historical data

Use Visualise to discover, segment, label and store significant features
in the catalogue.

-## Build & train models on historic data
+## Build & train models on historical data

Use Develop and Deploy to:

- Write model code in Python using their favourite IDE.

-- Train models on historic data.
+- Train models on historical data.

- Evaluate results against raw data and results from other models.

4 changes: 2 additions & 2 deletions docs/platform/definitions.md
@@ -22,7 +22,7 @@ Workspaces are collaborative. Multiple users, including developers, data scienti

## Project

-A set of code in Quix Platform that can be edited, compiled, executed, and deployed as one Docker image. Projects in Quix Platform are fully version controlled. You can also tag your code as an easy way to manage releases of your project.
+A set of code in Quix Platform that can be edited, compiled, run, and deployed as one Docker image. Projects in Quix Platform are fully version controlled. You can also tag your code as an easy way to manage releases of your project.

## Deployment

@@ -153,7 +153,7 @@ A [WebSockets API](../apis/streaming-reader-api/intro.md) used to stream any dat

### Data Catalogue API

-An [HTTP API](../apis/data-catalogue-api/intro.md) used to query historic data in the Data Catalogue. Most commonly used for dashboards, analytics and training ML models. Also useful to call historic data when running an ML model, or to call historic data from an external application.
+An [HTTP API](../apis/data-catalogue-api/intro.md) used to query historical data in the Data Catalogue. Most commonly used for dashboards, analytics and training ML models. Also useful to call historical data when running an ML model, or to call historical data from an external application.

### Portal API

2 changes: 1 addition & 1 deletion docs/platform/how-to/jupyter-nb.md
@@ -52,7 +52,7 @@ You need to be logged into the platform for this:

![how-to/jupyter-wb/connect-python.png](../../platform/images/how-to/jupyter-wb/connect-python.png)

-Copy the Python code to your Jupyter notebook and execute.
+Copy the Python code to your Jupyter notebook and run it.

![how-to/jupyter-wb/jupyter-results.png](../../platform/images/how-to/jupyter-wb/jupyter-results.png)

2 changes: 1 addition & 1 deletion docs/platform/how-to/webapps/write.md
@@ -237,7 +237,7 @@ req.end();
```

In the preceding example, tags in the event data request are optional.
-Tags add context to your data points and help you to execute efficient
+Tags add context to your data points and help you to run efficient
queries over your data, much as indexes do in traditional
databases.
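
As a compact illustration of the idea — the endpoint, token and field names below are placeholders rather than the exact request built earlier in this guide — here is a tagged event in Python:

```python
import time

import requests

# Placeholders only; the guide above shows the exact request shape.
event = {
    "timestamp": time.time_ns(),
    "id": "motion-detected",
    "value": "camera-3",
    # Tags add queryable context, much like indexed columns in a database.
    "tags": {"building": "HQ", "floor": "2"},
}

response = requests.post(
    "https://writer-myworkspace.platform.quix.ai/topics/my-topic/streams/my-stream/events/data",
    headers={"Authorization": "bearer YOUR_PAT_TOKEN"},
    json=[event],
)
response.raise_for_status()
```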

92 changes: 69 additions & 23 deletions docs/platform/security/security.md
@@ -1,37 +1,83 @@
-# Security
+# Cloud Security Principles

-This section describes the basic security features of Quix.
+## Introduction

-## Data in flight
+Quix's mission is to free developers to build, test and run next-gen applications without the hassle of managing complex technologies. We believe that we must make your data secure and that protecting it is one of our most important responsibilities. We're committed to being transparent about our security practices and helping you understand our approach.

-### Authentication
+Quix includes a robust set of security and data protection product features that give you the control, visibility and flexibility you need to manage all your security challenges, without compromising agility.

-- Our APIs are authenticated using
-[OAuth 2.0](https://datatracker.ietf.org/doc/html/rfc6749){target=_blank} token. We
-are using [Auth0](https://auth0.com/docs/protocols/protocol-oauth2){target=_blank}
-as our provider.
+This document outlines how Quix helps customers configure, deploy and use the cloud service securely.

-- Each Kafka server is authenticated using certificate, which is
-provided for each project created and can also be downloaded from
-topics view. The client is authenticated using SASL (username,
-password).
+## Authentication

-### Authorization
+Securing your information starts with identity controls, no matter where your users are located. Quix allows you to manage users, streamline authentication using your identity provider, and assign roles. We give you the solutions to ensure that only the right people can access your company's information in Quix.

-- The APIs is using RBAC. You are limited in what you can do based on
-your token and the role configured for your user.
+OAuth is the protocol Quix uses when you authenticate against our platform using Google or your preferred identity provider. Customers are responsible for integrating and managing their identity provider (for single sign-on and provisioning) as well as assigning roles in Quix.

-- Each kafka client is authrozied to only read and write to the topics
-or query consumer group information regarding topics owned by the
-organization the client belongs to.
+Data in flight is protected with Authentication (OAuth 2.0 tokens, SASL, SSL Certificates), Authorisation (RBAC) and Encryption (TLS 1.2).
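
As a sketch of what this looks like from a client's perspective — the broker address, SASL mechanism, certificate path and credentials below are placeholder assumptions, not Quix-specific values — here is a Kafka producer authenticating over an encrypted channel:

```python
from kafka import KafkaProducer  # pip install kafka-python

# Placeholder values; real addresses, certificates and credentials come
# from your own workspace configuration.
producer = KafkaProducer(
    bootstrap_servers="kafka.example.com:9093",
    security_protocol="SASL_SSL",      # SASL credentials over a TLS channel
    ssl_cafile="ca.cert",              # certificate used to verify the broker
    sasl_mechanism="SCRAM-SHA-256",    # mechanism assumed for this sketch
    sasl_plain_username="my-username",
    sasl_plain_password="my-password",
)
producer.send("my-topic", b"hello")
producer.flush()
```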

+### Two-Factor Authentication

+Multi-factor authentication splits an authentication process across separate channels, so that a single compromised system or device is not sufficient for unauthorised access. It is a widely used technique with a proven record.

+Customers may use two-factor authentication to access Quix. This is welcome and encouraged, but it is not enforced by default. To request enforcement, you can write to [[email protected]](mailto:[email protected]).

+## Data security

+By default, Quix encrypts data at rest and data in transit as part of our foundational security controls. We also provide tools that give you even further protection and control.

### Encryption

-- All our APIs communicate with TLS 1.2
+Our preferred encryption is TLS 1.3, with 1.2 allowed as a fallback. We don't support TLS 1.1 or older in any part of our platform.

+Older ciphers are cryptographically unsafe. We do not serve or support unsafe and weak ciphers, ensuring that our posture is in line with your standards and expectations.

+#### Encryption of Data in Flight

+Your connection to Quix is secured by the latest in TLS. We also use certificates to encrypt our in-flight traffic internally.

+Your traffic to and within Quix is very important to us, and that is why we keep it safe. TLS and the use of certificates at every surface where communication between computers happens ensures we drastically reduce attack vectors and the risk of data falling into the wrong hands.
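
Client code can mirror this policy. A minimal Python sketch — the hostname is a placeholder for any endpoint you call — that refuses to negotiate anything older than TLS 1.2:

```python
import socket
import ssl

# Refuse anything older than TLS 1.2, matching the platform's policy.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

HOST = "portal.example.com"  # placeholder hostname
with socket.create_connection((HOST, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        print(tls.version())  # e.g. "TLSv1.3" when the server supports it
```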

+#### Encryption of Data at Rest

+Customer data at rest resides in Azure and AWS. Any persisted data is encrypted with keys managed in a safe and secure manner, either by our cloud providers or by Quix personnel, adhering to our access policies and procedures.

+Data saved on disk is sensitive information and we always treat it as such.

+## Separation of Concerns

+To give you even further protection and control, we architected Quix on independent environments and firewalls. Logical separation ensures that customers can only access their own data and no one else's: potential malicious usage of the service will not affect the service or data of another.

+### Environments

+Environments at Quix are hermetically sealed with no reused components between them. Development and Production environments are distinct entities with no cross-talk.

+The separation of these concerns allows us to deliver a Quix platform experience in a way that minimises the chance of errors and mistakes, and is a well-supported industry standard of software delivery.

+If you choose to host Quix on your own platform, we recommend that you follow the same practices.

+### Firewalls

+Firewalls in cloud-native infrastructure and applications work differently from how they used to in the days of monolithic apps running on bare metal servers stacked neatly in a server room.

+All networking technologies utilised during the delivery of the Quix Platform follow the principle of least privilege; we configure our security groups to only allow the minimum necessary traffic, and we configure our access lists to do the same. We follow industry best practices in architecting these safeguards and constantly monitor and audit them.

+## Employee access policies

+Quix employees access our key systems with multi-factor authentication enforced. This helps us verify the identity of the person accessing these services and reduce the chance of unauthorised access by way of compromised channels or devices.

+At Quix, systems that make up the Quix platform are only ever accessed when necessary and only by authorised personnel. We take our commitment to security and confidentiality seriously.

+Restricted access ensures only colleagues in the necessary roles can work on the underlying software and infrastructure stack. This, combined with audit trails built right into our processes and tooling, helps us maintain the principle of least privilege, an important security practice.

+## Compliance

+Quix is aiming to meet and exceed one of the most broadly recognised security standards.

+### ISO-27001

-## Data at rest
+ISO-27001 details IT security management systems and procedures. We are currently pursuing certification under this rigorous standard while actively working towards standardising our written policies and procedures.

-- Your data is encrypted at rest using cloud provider (Azure) managed
-keys.
+An ISO-27001 certification is a quick and easy way to judge the general security posture of an organisation. By obtaining this certification, we aim to demonstrate our commitment to information security.

-- Your data is phyisically protected at our cloud provider’s location.
+We are aiming to complete the internal audit by October 2023 and obtain the certification by the end of the calendar year 2023.
@@ -202,7 +202,6 @@ To learn more, try one of these tutorials:
* [Build a live video processing pipeline using the Transport for London (TfL) traffic cameras and the YOLO ML model for object detection](../image-processing/index.md)
* [Perform sentiment analysis on a stream of Tweets about a given subject](../sentiment-analysis/index.md)
* [Gather and process data from an RSS feed and get an alert when specific criteria are met](../rss-tutorial/rss-processing-pipeline.md)
-* [Stream and visualize real-time telemetry data with an Android app and Streamlit](../telemetry-data/telemetry-data.md)


!!! tip "Getting Help"
21 changes: 21 additions & 0 deletions docs/platform/tutorials/data-science/1-bikedata.md
@@ -0,0 +1,21 @@
# CitiBike data

Start by getting the real-time bicycle data. Use the Quix CitiBikes connector to fetch real-time bicycle availability data (it doesn't require a sign-up or any keys).

You won't need to write lots of code, as you will use the Quix Code Samples to deploy a prebuilt service that streams data from the New York CitiBikes API:

1. Navigate to `Code Samples` using the left-hand menu, search for `New York`, then select the `New York Bikes` tile.

![NY Bikes sample tile](./images/ny-bikes-library-tile.png){width=200px}

2. Click `Setup and deploy`:

a. Leave the `Name` as it is.

b. Ensure the `output` is set to `bikes-topic`.

3. Click `Deploy`.

The precompiled service is deployed to your workspace and begins running immediately.

[Part 2 - Weather data :material-arrow-right-circle:{ align=right }](2-weatherdata.md)
51 changes: 51 additions & 0 deletions docs/platform/tutorials/data-science/2-weatherdata.md
@@ -0,0 +1,51 @@
# Weather data

You now have a working real-time stream of bicycle data. Next, you will integrate the data from a free weather API, adding current and forecasted weather data.

## Create a free Visual Crossing account

!!! info

    [Visual Crossing](https://www.visualcrossing.com/){target=_blank} is a leading provider of weather data and enterprise analysis tools to data scientists, business analysts, professionals, and academics.

1. Go to the [Visual Crossing sign up page](https://www.visualcrossing.com/sign-up){target=_blank}.

2. Follow the instructions to create your account.

3. Go to the [Account](https://www.visualcrossing.com/account){target=_blank} page to copy your key.

Keep it safe for later.

## Weather real-time stream

You can now deploy the VisualCrossing connector from the Quix Code Samples:

1. Search the Code Samples for `weather` and select the `VisualCrossing Weather` tile.

2. Click `Setup and deploy`.

3. Leave the `Name` as it is.

4. Ensure the `output` is set to `weather-topic`.

5. Paste your API key into the `api_token` field. This is the API key you obtained from your Visual Crossing account page.

6. Click `Deploy`.

The precompiled service is deployed to your workspace and begins running immediately.

!!! warning "Visual Crossing usage limitation"

    The Visual Crossing API limits how much data you can access for free, and real weather changes only slowly in real time.

    The free Visual Crossing account is limited to 1,000 records per day. To prevent your account from being rate limited, the connector is coded to request new data every 2 minutes; however, you can trigger collection of new data by restarting the service as needed. You will do this several times throughout this tutorial.
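
The connector's own source isn't reproduced here, but the polling pattern it describes looks roughly like this sketch, assuming the Visual Crossing Timeline endpoint (the location and key are placeholders):

```python
import time

import requests

API_KEY = "YOUR_VISUAL_CROSSING_KEY"  # from your Visual Crossing account page
URL = (
    "https://weather.visualcrossing.com/VisualCrossingWebServices"
    "/rest/services/timeline/New%20York%2CNY"
)

while True:
    response = requests.get(URL, params={"key": API_KEY, "unitGroup": "metric"})
    response.raise_for_status()
    print(response.json()["currentConditions"])
    time.sleep(120)  # one call every 2 minutes stays well under 1,000/day
```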

## Summary

At this stage you have two services running.

One is publishing `New York CitiBike` data to a topic called `bikes-topic`, and the other is publishing `Visual Crossing` weather data to a topic called `weather-topic`.

![Successfully deployed pipeline](./images/early-success.png)

[Part 3 - Data views :material-arrow-right-circle:{ align=right }](3-data.md)