Integrating InDepthAnalysis into your application
Prerequisites
In this guide we are building an integration that covers the following scenarios:
- Uploading audio files to Cyanite.ai
- Enqueuing a file analysis
- Setting up a webhook to get notified when the file analysis has finished
- Fetching the analysis result.
The tutorial will mostly provide Node.js code. However, all listed GraphQL Operations can be used with any other programming language.
You can find the whole source code of the example integration on GitHub: cyanite-ai/cyanite-integration-example.
Scaffolding the project
Create a project folder
First of all, we create an empty project:
mkdir cyanite-integration && cd cyanite-integration && yarn init -y
Install dependencies
Then we install all the required packages:
yarn add -E envalid@6.0.0 express@4.17.1 body-parser@1.19.0 node-fetch@2.6.1
| Package | Used for |
| --- | --- |
| envalid | validate environment variables |
| express | minimalist framework for creating a webhook listener |
| body-parser | request body parsing middleware for express |
| node-fetch | HTTP request module for sending requests and uploading files to the Cyanite.ai API |
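Later in this guide the webhook server is started via yarn start. If you want the same shortcut, you can add a start script to your package.json (just a sketch; the scripts in the example repository may differ):

```json
{
  "scripts": {
    "start": "node src/webhook.js"
  }
}
```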
Obtaining API credentials
Follow Creating an integration for obtaining an access token and a webhook secret.
Then create a new file cyanite-integration/.env with the following contents:
ACCESS_TOKEN=YOUR_ACCESS_TOKEN
SECRET=YOUR_SECRET
PORT=8080
API_URL=https://api.cyanite.ai/graphql
Make sure you replace YOUR_SECRET and YOUR_ACCESS_TOKEN with the corresponding values for your integration.
Uploading and enqueuing files
Pick a file
Scan your music collection for an mp3 that you want to analyze. For the best results, your mp3 files should have a bitrate of 320 kbit/s.
You can easily convert your wav (or any other audio format) files to mp3 with ffmpeg.
ffmpeg -i inputfile.wav -ab 320k outputfile.mp3
Uploading a file
The upload can be divided into three steps:
- Requesting a file upload
- Uploading the file
- Creating an InDepthAnalysis from the uploaded file
Requesting a file upload
For requesting a file upload we use the fileUploadRequest mutation field.
mutation fileUploadRequest {
fileUploadRequest {
id
uploadUrl
}
}
On the FileUploadRequest we select the id and the uploadUrl. We need both fields to proceed.
The uploadUrl is the URL to which we are uploading our file.
The id is a unique identifier which we need for creating the InDepthAnalysis from our uploaded file.
Let's start by requesting our file upload!
src/file-upload.js
"use strict";
const envalid = require("envalid");
const fs = require("fs");
const fetch = require("node-fetch");
const { API_URL, ACCESS_TOKEN } = envalid.cleanEnv(process.env, {
API_URL: envalid.str(),
ACCESS_TOKEN: envalid.str(),
});
const fileUploadRequestMutation = /* GraphQL */ `
mutation fileUploadRequest {
fileUploadRequest {
id
uploadUrl
}
}
`;
const requestFileUpload = async () => {
const result = await fetch(API_URL, {
method: "POST",
body: JSON.stringify({
query: fileUploadRequestMutation,
}),
headers: {
Authorization: "Bearer " + ACCESS_TOKEN,
"Content-Type": "application/json",
},
}).then((res) => res.json());
console.log("[info] fileUploadRequest response: ");
console.log(JSON.stringify(result, undefined, 2));
};
const main = async (filePath) => {
const { id, uploadUrl } = await requestFileUpload();
console.log({ id, uploadUrl });
};
main(process.argv[2]).catch((err) => {
console.error(err);
process.exitCode = 1;
});
Uploading the file
Now that we have all the necessary information for uploading the file, we can perform our file upload.
For uploading we use the HTTP PUT method. You can verify that the upload was successful by checking the HTTP status code of the response, which should be equal to 200.
src/file-upload.js
"use strict";
const envalid = require("envalid");
const fs = require("fs");
const fetch = require("node-fetch");
const { API_URL, ACCESS_TOKEN } = envalid.cleanEnv(process.env, {
API_URL: envalid.str(),
ACCESS_TOKEN: envalid.str(),
});
const fileUploadRequestMutation = /* GraphQL */ `
mutation fileUploadRequest {
fileUploadRequest {
id
uploadUrl
}
}
`;
const requestFileUpload = async () => {
const result = await fetch(API_URL, {
method: "POST",
body: JSON.stringify({
query: fileUploadRequestMutation,
}),
headers: {
Authorization: "Bearer " + ACCESS_TOKEN,
"Content-Type": "application/json",
},
}).then((res) => res.json());
console.log("[info] fileUploadRequest response: ");
console.log(JSON.stringify(result, undefined, 2));
return result.data.fileUploadRequest;
};
const uploadFile = async (filePath, uploadUrl) => {
const result = await fetch(uploadUrl, {
method: "PUT",
body: fs.createReadStream(filePath),
headers: {
"Content-Type": fs.statSync(filePath).size,
},
}).then((res) => {
if (res.status !== 200) {
throw Error("Failed to upload file.");
}
return res.text();
});
console.log(result);
};
const createInDepthAnalysis = async (fileUploadRequestId) => {};
const main = async (filePath) => {
const { id, uploadUrl } = await requestFileUpload();
await uploadFile(filePath, uploadUrl);
};
main(process.argv[2]).catch((err) => {
console.error(err);
process.exitCode = 1;
});
Create an InDepthAnalysis from an uploaded file
Now we have uploaded the file. Next, we need to create an InDepthAnalysis from our uploaded file.
For creating the InDepthAnalysis via the Cyanite.ai API we use the inDepthAnalysisCreate mutation.
mutation inDepthAnalysisCreate($data: InDepthAnalysisCreateInput!) {
inDepthAnalysisCreate(data: $data) {
__typename
... on InDepthAnalysisCreateResultSuccess {
inDepthAnalysis {
id
title
status
}
}
... on Error {
message
}
}
}
The inDepthAnalysisCreate mutation field returns the union type InDepthAnalysisCreateResult. A union type specifies that the mutation can return a variety of possible results. Depending on the result we use InlineFragments for specifying our data requirements.
- InDepthAnalysisCreateResultSuccess: The mutation finished as expected; we can query for the id, title, status or any other fields that belong to our InDepthAnalysis.
- Error: An error occurred. The details are included in a message attached to the error object. The error can be one of the following types:
  - InDepthAnalysisRecordLimitExceededError: You've exceeded your analysis limit.
  - InDepthAnalysisInvalidTagError: You can attach a tag to your analysis (see tagging InDepthAnalysis). If the tag you are trying to add is invalid, this error is returned.
The Error interface type is implemented by all error types. Instead of writing a SelectionSet for each error type we can use an InlineFragment on Error to gather the error message. In case another error type provides more detailed information that we want to use, we can add additional SelectionSets. For now, we want to keep things simple.
We also query for the type name (__typename). Each ObjectType (e.g. InDepthAnalysisCreateResultSuccess and InDepthAnalysisRecordLimitExceededError) has a __typename field. We use it to distinguish between the individual types.
Note: Despite us selecting data on the Error interface, __typename can never be Error. An interface is an abstract type. However, the names of all types that implement the Error interface end with Error (e.g. InDepthAnalysisRecordLimitExceededError). In the integration code we can therefore simply check whether __typename ends with Error.
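For example, a tiny helper for this check could look like the following (just a sketch; the scripts in this guide inline the same check instead of using a helper):

```js
// Sketch: treat any mutation result whose __typename ends with "Error" as a failure.
// `mutationResult` is the object returned for the mutation field,
// e.g. result.data.inDepthAnalysisCreate from the parsed JSON response.
const assertNoError = (mutationResult) => {
  if (mutationResult.__typename.endsWith("Error")) {
    throw new Error(mutationResult.message);
  }
  return mutationResult;
};
```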
src/file-upload.js
"use strict";
const envalid = require("envalid");
const fs = require("fs");
const fetch = require("node-fetch");
const { API_URL, ACCESS_TOKEN } = envalid.cleanEnv(process.env, {
API_URL: envalid.str(),
ACCESS_TOKEN: envalid.str(),
});
const fileUploadRequestMutation = /* GraphQL */ `
mutation fileUploadRequest {
fileUploadRequest {
id
uploadUrl
}
}
`;
const inDepthAnalysisCreateMutation = /* GraphQL */ `
mutation inDepthAnalysisCreate($data: InDepthAnalysisCreateInput!) {
inDepthAnalysisCreate(data: $data) {
__typename
... on InDepthAnalysisCreateResultSuccess {
inDepthAnalysis {
id
title
status
}
}
... on Error {
message
}
}
}
`;
const requestFileUpload = async () => {
const result = await fetch(API_URL, {
method: "POST",
body: JSON.stringify({
query: fileUploadRequestMutation,
}),
headers: {
Authorization: "Bearer " + ACCESS_TOKEN,
"Content-Type": "application/json",
},
}).then((res) => res.json());
console.log("[info] fileUploadRequest response: ");
console.log(JSON.stringify(result, undefined, 2));
return result.data.fileUploadRequest;
};
const uploadFile = async (filePath, uploadUrl) => {
const result = await fetch(uploadUrl, {
method: "PUT",
body: fs.createReadStream(filePath),
headers: {
"Content-Type": fs.statSync(filePath).size,
},
}).then((res) => {
if (res.status !== 200) {
throw Error("Failed to upload file.");
}
return res.text();
});
console.log(result);
};
const createInDepthAnalysis = async (fileUploadRequestId) => {
const result = await fetch(API_URL, {
method: "POST",
body: JSON.stringify({
query: inDepthAnalysisCreateMutation,
variables: {
data: {
fileName: "My first InDepthAnalysis",
uploadId: fileUploadRequestId,
},
},
}),
headers: {
Authorization: "Bearer " + ACCESS_TOKEN,
"Content-Type": "application/json",
},
}).then((res) => res.json());
console.log("[info] inDepthAnalysisCreate response: ");
console.log(JSON.stringify(result, undefined, 2));
return result.data.inDepthAnalysisCreate;
};
const main = async (filePath) => {
const { id, uploadUrl } = await requestFileUpload();
await uploadFile(filePath, uploadUrl);
await createInDepthAnalysis(id);
};
main(process.argv[2]).catch((err) => {
console.error(err);
process.exitCode = 1;
});
Let's execute this script to complete our first file upload to the Cyanite.ai API 🚀
node src/file-upload.js /path/to/your/file.mp3
The terminal output should look similar to this:
n1ru4l@outerspace:~/cyanite-integration$ node src/file-upload.js "/Users/n1ru4l/Documents/music/2019_04_11/Jam Thieves & Voltage - LSD.mp3"
/Users/n1ru4l/Documents/music/2019_04_11/Jam Thieves & Voltage - LSD.mp3
[info] start transferring file
[info] inDepthAnalysisCreate response:
{
"data": {
"inDepthAnalysisCreate": {
"__typename": "InDepthAnalysisCreateResultSuccess",
"inDepthAnalysis": {
"id": "916",
"title": "My first InDepthAnalysis",
"status": "NOT_STARTED"
}
}
}
}
Analysis Status
As you can see the file is now successfully uploaded, but its status is NOT_STARTED.
The field InDepthAnalysis.status has the enum type AnalysisStatus.
It can be one of the following:
| AnalysisStatus | Description |
| --- | --- |
| NOT_STARTED | File was successfully uploaded |
| ENQUEUED | File analysis is enqueued and awaiting processing |
| PROCESSING | File is being processed |
| FINISHED | File processing has finished successfully |
| FAILED | File processing has failed |
Enqueuing a file analysis
In order to process our uploaded file we need to send an additional mutation to the Cyanite.ai API. We can do this with the inDepthAnalysisEnqueueAnalysis mutation.
mutation inDepthAnalysisEnqueueAnalysis(
$input: InDepthAnalysisEnqueueAnalysisInput!
) {
inDepthAnalysisEnqueueAnalysis(data: $input) {
__typename
... on InDepthAnalysisEnqueueAnalysisResultSuccess {
success
inDepthAnalysis {
id
status
}
}
... on Error {
message
}
}
}
As with the inDepthAnalysisCreate mutation, we include InlineFragments for the expected result (InDepthAnalysisEnqueueAnalysisResultSuccess) and the unexpected results (InDepthAnalysisNotFoundError, InDepthAnalysisLimitExceededError, InDepthAnalysisAlreadyEnqueuedError).
The latter are all covered by the Error fragment.
src/file-enqueue-analysis.js
"use strict";
const envalid = require("envalid");
const fetch = require("node-fetch");
const { API_URL, ACCESS_TOKEN } = envalid.cleanEnv(process.env, {
API_URL: envalid.str(),
ACCESS_TOKEN: envalid.str(),
});
const inDepthAnalysisEnqueueAnalysis = async (inDepthAnalysisId) => {
const mutationDocument = /* GraphQL */ `
mutation inDepthAnalysisEnqueueAnalysis(
$input: InDepthAnalysisEnqueueAnalysisInput!
) {
inDepthAnalysisEnqueueAnalysis(data: $input) {
__typename
... on InDepthAnalysisEnqueueAnalysisResultSuccess {
success
inDepthAnalysis {
id
status
}
}
... on Error {
message
}
}
}
`;
const result = await fetch(API_URL, {
method: "POST",
body: JSON.stringify({
query: mutationDocument,
variables: { input: { inDepthAnalysisId } },
}),
headers: {
Authorization: "Bearer " + ACCESS_TOKEN,
"Content-Type": "application/json",
},
}).then((res) => res.json());
console.log("[info] inDepthAnalysisEnqueueAnalysis response: ");
console.log(JSON.stringify(result, undefined, 2));
if (result.data.inDepthAnalysisEnqueueAnalysis.__typename.endsWith("Error")) {
throw new Error(result.data.inDepthAnalysisEnqueueAnalysis.message);
}
return result.data;
};
const main = async (inDepthAnalysisId) => {
await inDepthAnalysisEnqueueAnalysis(inDepthAnalysisId);
};
main(process.argv[2]).catch((err) => {
console.error(err);
process.exitCode = 1;
});
Let's enqueue the file from earlier. Please note that you must use the id of the file you uploaded instead of the one shown here.
n1ru4l@outerspace:~/cyanite-integration$ node src/file-enqueue-analysis.js "916"
[info] inDepthAnalysisEnqueueAnalysis response:
{
"data": {
"inDepthAnalysisEnqueueAnalysis": {
"__typename": "InDepthAnalysisEnqueueAnalysisResultSuccess",
"success": true,
"inDepthAnalysis": {
"id": "916",
"status": "ENQUEUED"
}
}
}
}
We have now successfully uploaded and enqueued our first file analysis! Hurray 🎉!
Listening to Webhook Events
The process of analyzing our files is asynchronous, that means that there is no direct connection being kept alive until the file has finished processing. Instead of constantly polling the status of a file, the Cyanite.ai API allows registering a Webhook Endpoint that will be notified once a file has been processed/failed to process.
This approach is much more intuitive than sending unnecessary requests or keeping a connection alive.
Note: For your production integration you should have a server that is facing the public internet. For this demonstration we are going to use ngrok for exposing the local port of our machine to the internet.
Starting the ngrok proxy
npx ngrok http 8080
Forwarding http://xxxxxde.ngrok.io -> http://localhost:8080
Forwarding https://xxxxxde.ngrok.io -> http://localhost:8080
Open the integration section on Cyanite.ai and edit your integration by setting the webhook URL to http://b6d29bde.ngrok.io/incoming-webhook.
Your URL will differ slightly from the one above.
Creating a webhook
Up next, let's finally write our webhook code.
src/webhook.js
"use strict";
const crypto = require("crypto");
const envalid = require("envalid");
const express = require("express");
const bodyParser = require("body-parser");
const app = express();
const { PORT, SECRET } = envalid.cleanEnv(process.env, {
PORT: envalid.num(),
SECRET: envalid.str(),
});
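// The "signature" header of an incoming webhook request contains an HMAC-SHA512
// of the JSON request body, keyed with our webhook secret. The helper below
// recomputes that HMAC over JSON.stringify(req.body) and compares it to the header value.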
const isSignatureValid = (secret, signature, message) => {
const hmac = crypto.createHmac("sha512", secret);
hmac.write(message);
hmac.end();
const compareSignature = hmac.read().toString("hex");
return signature === compareSignature;
};
const WEBHOOK_ROUTE_NAME = "/incoming-webhook";
app.use(bodyParser.json());
app.post(WEBHOOK_ROUTE_NAME, (req, res) => {
if (!req.body) {
return res.sendStatus(422); // Unprocessable Entity
}
console.log("[info] incoming event:");
console.log(JSON.stringify(req.body, undefined, 2));
if (req.body.type === "TEST") {
console.log("[info] processing test event");
return res.sendStatus(200);
}
// verifying the request signature is not required but recommended
// by verifying the signature you can ensure the incoming request was sent by Cyanite.ai
if (
!isSignatureValid(SECRET, req.headers.signature, JSON.stringify(req.body))
) {
console.log("[info] signature is invalid");
return res.sendStatus(400);
}
console.log("[info] signature is valid");
if (req.body.type === "IN_DEPTH_ANALYSIS_FINISHED") {
console.log("[info] processing finish event");
// You can use the result here, but keep in mind that you should probably process the result asynchronously
// The request of the incoming webhook will be canceled after 3 seconds.
}
// Do something with the result here
return res.sendStatus(200);
});
app.listen(PORT, () => {
console.log(
`Server listening on http://localhost:${PORT}${WEBHOOK_ROUTE_NAME}`
);
});
Afterwards we can start the server (node src/webhook.js).
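As noted in the handler comments above, the incoming webhook request is canceled after 3 seconds, so heavier work (like fetching the analysis result) should not block the response. Here is a minimal sketch of how the handler could defer that work, using a hypothetical handleFinishedAnalysis function:

```js
// Sketch: acknowledge the webhook first, then do the heavy lifting afterwards.
// handleFinishedAnalysis is a placeholder for your own logic, e.g. fetching and
// storing the analysis result (see "Fetching the analysis result" below).
const handleFinishedAnalysis = async (inDepthAnalysisId) => {
  console.log(`[info] handling finished analysis ${inDepthAnalysisId}`);
};

app.post(WEBHOOK_ROUTE_NAME, (req, res) => {
  // ...body and signature checks as shown above...
  res.sendStatus(200); // respond right away
  if (req.body.type === "IN_DEPTH_ANALYSIS_FINISHED") {
    setImmediate(() => {
      handleFinishedAnalysis(req.body.data.inDepthAnalysisId).catch((err) =>
        console.error(err)
      );
    });
  }
});
```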
Receiving a webhook event
Before receiving any event, we first need to initiate an action that triggers an event. We can do that by re-enqueueing our previously uploaded file:
n1ru4l@outerspace:~/cyanite-integration$ node src/file-enqueue-analysis.js "916"
[info] inDepthAnalysisEnqueueAnalysis response:
{
"data": {
"inDepthAnalysisEnqueueAnalysis": {
"__typename": "InDepthAnalysisEnqueueAnalysisResultSuccess",
"success": true,
"inDepthAnalysis": {
"id": "916",
"status": "ENQUEUED"
}
}
}
}
After a few seconds the webhook terminal should output something similar to this:
n1ru4l@outerspace:~/cyanite-integration$ yarn start
yarn run v1.15.2
$ node src/webhook.js
Server listening on http://localhost:8080/incoming-webhook
[info] incoming event:
{
"type": "IN_DEPTH_ANALYSIS_FINISHED",
"data": {
"inDepthAnalysisId": "916"
}
}
[info] signature is valid
[info] processing finish event
That means our event was delivered successfully 🚀!
Fetching the analysis result
Depending on what we need, we can fetch different parts of the analysis result. For example, we could query for similar sounding Spotify tracks, the detected genres, or both at the same time!
For this demonstration we are going to fetch the analyzed genres. Feel free to explore the available data using GraphiQL.
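A small script for fetching the genres could look like this (a sketch based on the more complex query shown below; the file name src/fetch-result.js is just a suggestion):

```js
"use strict";
const envalid = require("envalid");
const fetch = require("node-fetch");

const { API_URL, ACCESS_TOKEN } = envalid.cleanEnv(process.env, {
  API_URL: envalid.str(),
  ACCESS_TOKEN: envalid.str(),
});

// Sketch: fetch the detected genres of a finished InDepthAnalysis by its id.
const genresQuery = /* GraphQL */ `
  query inDepthAnalysisGenres($inDepthAnalysisId: ID!) {
    inDepthAnalysis(recordId: $inDepthAnalysisId) {
      __typename
      ... on InDepthAnalysis {
        id
        status
        result {
          genres {
            title
            confidence
          }
        }
      }
      ... on Error {
        message
      }
    }
  }
`;

const main = async (inDepthAnalysisId) => {
  const result = await fetch(API_URL, {
    method: "POST",
    body: JSON.stringify({
      query: genresQuery,
      variables: { inDepthAnalysisId },
    }),
    headers: {
      Authorization: "Bearer " + ACCESS_TOKEN,
      "Content-Type": "application/json",
    },
  }).then((res) => res.json());
  console.log(JSON.stringify(result, undefined, 2));
};

main(process.argv[2]).catch((err) => {
  console.error(err);
  process.exitCode = 1;
});
```

Run it with the id of your analysis, e.g. node src/fetch-result.js "916".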
Here you can find a more complex Query:
query inDepthAnalysis($inDepthAnalysisId: ID!) {
inDepthAnalysis(recordId: $inDepthAnalysisId) {
__typename
... on InDepthAnalysis {
id
status
result {
fileInfo {
duration
}
segmentData {
timestamps
valence
arousal
moodScores {
type
name
values
}
genreScores {
type
name
values
}
}
genres {
title
confidence
}
moodMeanScores {
type
name
value
}
}
}
... on Error {
message
}
}
}
Also check out the Query builder, which can help you write queries for the classifier data you need.
Conclusion
We successfully built our first Cyanite.ai Integration!
You can also find the whole project on GitHub.
In case you have any open questions or suggestions on how we could improve this guide, contact us via sales@cyanite.ai.