
Getting Started

Introduction


Welcome to Hybrik -- Dolby's cloud media processing service. Every element of the Hybrik workflow can be managed through Hybrik's RESTful API. This includes transcoding, quality control, data transfers, etc. Even complex workflows with conditional execution and scripting can be designed and implemented using the Hybrik API. When defining transcodes through the API, users can reference pre-configured presets or explicitly define the complete encode pipeline. The API uses JSON structures for the definition of each element.

All Hybrik API submissions are made via the HTTP methods POST, PUT, GET, and DELETE. The API uses HTTP Basic authentication for permitting API access, plus a user-specific login call which returns an expiring token. This token needs to be passed into all subsequent API calls as part of the header.

A typical API session to submit and track a transcoding job would look like this:
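As a rough sketch (using the connector module and endpoints described later in this document; the field list and promise chaining shown here are illustrative only):

// log in / connect, submit the job, then check its progress
hybrik_api.connect()
    .then(function () {
        // 1. submit the job definition
        return hybrik_api.call_api('POST', '/jobs', null, theJob);
    })
    .then(function (response) {
        // 2. track the job via its returned id
        return hybrik_api.call_api('GET', '/jobs/' + response.id + '/info',
            { fields: ['id', 'status', 'progress'] });
    })
    .then(function (info) {
        console.log('Status: ' + info.status + ' (' + info.progress + '%)');
    });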

At the bottom of this document are downloadable samples for both JSON jobs as well as JavaScript libraries for easily incorporating the Hybrik API into your projects.

REST Arguments

The Hybrik REST API follows REST conventions regarding the placement of arguments in a query string or in a request body. Some limitations apply, however: the maximum number of array elements that can be passed via a query string is 250. If any array in your request exceeds this length, the argument must be passed in the request body.

Generally, Hybrik will attempt to parse arguments from both locations in order to overcome issues such as maximum URL length.
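For example (a sketch using the connector module described below; large_list_of_ids is a placeholder variable), a long list of job ids belongs in the request body rather than the query string:

// small list: fine to pass as URL parameters on a GET
hybrik_api.call_api('GET', '/jobs/info', { ids: [12345, 12346], fields: ['id', 'status'] });

// large list (more than 250 elements): pass it in the request body instead
hybrik_api.call_api('DELETE', '/jobs', null, { ids: large_list_of_ids });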

Jobs and Tasks

Example Job JSON

{
  "name": "Hybrik API Example#1 - simple transcode",
  "payload": {
    "elements": [
      {
        "uid": "source_file",
        "kind": "source",
        "payload": {
          "kind": "asset_url",
          "payload": {
            "storage_provider": "s3",
            "url": "s3://hybrik-examples/public/sources/sample1.mp4"
          }
        }
      },
      {
        "uid": "transcode_task",
        "kind": "transcode",
        "task": {
          "retry_method": "fail"
        },
        "payload": {
          "location": {
            "storage_provider": "s3",
            "path": "s3://hybrik-examples/public/output/example1"
          },
          "targets": [
            {
              "file_pattern": "%s.mp4",
              "existing_files": "replace",
              "container": {
                "kind": "mp4"
              },
              "video": {
                "codec": "h264",
                "width": 640,
                "height": 360,
                "frame_rate": 23.976,
                "bitrate_kb": 600
              },
              "audio": [
                {
                  "codec": "heaac_v2",
                  "channels": 2,
                  "sample_rate": 44100,
                  "bitrate_kb": 128
                }
              ]
            }
          ]
        }
      }
    ],
    "connections": [
      {
        "from": [
          {
            "element": "source_file"
          }
        ],
        "to": {
          "success": [
            {
              "element": "transcode_task"
            }
          ]
        }
      }
    ]
  }
}

A Job is defined in JSON notation and tells the Hybrik service precisely how to execute that Job. Note that a Job is made up of one or more Tasks, and Tasks are then distributed across the available machines. Therefore a Job may end up executing on many different machines. Both Jobs and Tasks can have varying priorities within the system to allow you to finely control performance and latency. Tasks may even be set to retry if they fail, which allows you to manage intermittent error conditions like network accessibility. Depending on your workflow, some Tasks may not be executed. For example, consider a Job in which a QC task determines which of the subsequent Tasks run:

If the file passes the QC test, Tasks 1, 2, 3, 4, 5, and 6 execute. If the file fails the QC test, Tasks 1, 2, 3, 7, and 8 execute. A Job specifies not only which Tasks will be executed, but also how those Tasks are connected to each other.

The basic structure of a Job JSON looks like this:
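(In outline only; every field is described in the Job JSON section later in this document.)

{
  "name": "...",
  "payload": {
    "elements": [
      /* one element per task: source, transcode, analyze, qc, etc. */
    ],
    "connections": [
      /* how the output of one element feeds the next */
    ]
  }
}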

A complete sample Job JSON is shown above (Example Job JSON). This Job takes a source file from an Amazon S3 location (s3://hybrik-examples/public/sources/sample1.mp4) and puts the transcode result into a different S3 location (s3://hybrik-examples/public/output/example1). The transcode video parameters are set to use the H.264 codec, with a width of 640 pixels, a height of 360 pixels, a frame rate of 23.976 frames per second, and a bitrate of 600 kb/sec. The audio format is set to use the HE-AAC v2 codec, with 2 channels of audio, at a sample rate of 44.1 kHz and a bitrate of 128 kb/sec.

Jobs may be simple and define only a Source and Transcode element, or they may be very complex with dozens of different tasks and conditional branching.

User Authentication

Version and Compliance

The Hybrik API version is specified in the request URI (e.g. "v1" in "https://api-demo.hybrik.com/v1/jobs"). All calls must include this version number.

In addition, an API compliance date must be passed with each call. This is passed via the 'X-Hybrik-Compliance' header and has the format 'YYYYMMDD' (e.g. 20190406). The compliance date essentially states "I, the client application, comply with the API up to this date." If we introduce changes to the API, we will ensure that the response is still valid for the client based on this compliance date value. That way you don't need to worry about changes to our API. Just tell us when you started using it and we will conform to your usage.

The entry point for testing the API will be https://api-demo.hybrik.com/v1. Note that once you are a customer of Hybrik, you will be given a different API entry point to operate against.

Authentication Login

Logging in to the Hybrik API

Hybrik uses two login mechanisms: Basic HTTP authentication for permitting API access, and a user-specific login call that returns an expiring token. This token must be passed to all API calls in the 'X-Hybrik-Sapiauth' header.

The token expires after 30 minutes; however, it is auto-extended (in 30-minute intervals) on successful API calls. After expiration, a new login call, returning a new token, is required.

The credentials for the Basic HTTP authentication are unique per customer account. That authentication does not permit access to any customer resources; it only safeguards access to the API itself. The Basic HTTP Auth credentials can be obtained by the account owner via the Hybrik web interface on the Account > Info page.

Do not share your API credentials or Basic HTTP Authentication credentials with anyone.

The example JavaScript code below is based on Node.js and uses the "hybrik_connector.js" module included in the sample code. The hybrik_connector module can be used to create an API access object and submit HTTP-based calls to the Hybrik API. The basic functions are:

Generate an API access object
var hybrik_api = HybrikAPI(api_url, compliance_date, oapi_key, oapi_secret, user_key, user_secret)

Connect to the API
hybrik_api.connect()

Submit an API call
hybrik_api.call_api(http_method, api_method, url_params, body_params)

The sample code can be reviewed in the section titled: API Sample JavaScript



// Construct an API access object, connect to the API, and then submit a job
var hybrik_api = new HybrikAPI(
    api_config.HYBRIK_URL, 
    api_config.HYBRIK_COMPLIANCE_DATE, 
    api_config.HYBRIK_OAPI_KEY, 
    api_config.HYBRIK_OAPI_SECRET, 
    api_config.HYBRIK_AUTH_KEY, 
    api_config.HYBRIK_AUTH_SECRET
);

// connect to the API
hybrik_api.connect()
    .then(function () {
        // submit the job by POSTing to the '/jobs' endpoint
        return hybrik_api.call_api('POST', '/jobs', null, theJob)
            .then(function (response) {
                console.log('Job ID: ' + response.id);
                return response.id;
            });
    })
    .catch(function (error) {
        // error handling and such here
        console.error(error);
    });

$ curl -u OAPI_KEY:OAPI_SECRET -X POST https://api-demo.hybrik.com/v1/login \
  -d '{
  "auth_key": "john.doe@customer.hybrik.com",
  "auth_secret": "very secret password"
}' \
  -H "Content-Type: application/json" \
  -H "X-Hybrik-Compliance: YYYYMMDD"

Login to the Hybrik API. This call requires Basic authentication with the API credentials provided by Hybrik.

HTTP Request

POST /login

Required Parameters

Name Type Description
auth_key string User name or email address.
Length: 0..512
auth_secret string User password.
Length: 0..512

Job Management

These API calls allow for the creation and management of Hybrik jobs. This includes getting information about a specific job or getting a list of jobs.

Create Job

//Submit a job
hybrik_api.call_api('POST', '/jobs', null, {
    name: "My job",
    priority: 100,
    tags: ["my_tag", "my_other_tag"],
    payload: {  
        // this is the job JSON
    }
})
$ curl -u OAPI_KEY:OAPI_SECRET -X POST https://api-demo.hybrik.com/v1/jobs \
  -d '{
  "name": "Fifth Element, Web Streaming",
  "user_tag": "myjob_012345",
  "schema": "hybrik",
  "payload": "<job payload>",
  "priority": 100,
  "expiration": 1440,
  "task_tags": [
    "high_performance",
    "us-west-1"
  ],
  "task_retry": {
    "count": 3,
    "delay_sec": 180
  }
}' \
  -H "Content-Type: application/json" \
  -H "X-Hybrik-Sapiauth: api_auth_token" \
  -H "X-Hybrik-Compliance: YYYYMMDD"

Response Object

{
  "id": "123456789"
}

Create a new job.

HTTP Request

POST /jobs

Required Parameters

Name Type Description
name string The visible name of a job.
payload string Depending on the schema, this must be a JSON object (schema = 'hybrik') or a serialized XML document (schema = 'rhozet')
Length: 0..128000
schema string Select the api schema compatibility of the job payload. Defaults to "hybrik" if not specified.
one of:"hybrik" or "rhozet" or "api_test"

Optional Parameters

Name Type Description
expiration integer Expiration (in minutes) of the job. A completed job will expire and be deleted after [expiration] minutes. Default is 30 days.
default: 43200
Range: value <= 259200
priority integer The priority (1: lowest, 254: highest) of a job.
default: 100
Range: 1 <= value <= 254
task_retry:count integer The number of times to attempt to retry the task if there is a failure.
default: 0
task_retry:delay_sec integer The number of seconds to wait before a retry attempt.
default: 45
task_tags array The tags all the tasks of this job will have. Note that a render node needs to provide these tags for a task to be executed on that node.
user_tag nullable string An optional, free-form, persistent identifier for this job. The intent is to provide a machine-trackable, user-specified identifier. For human-readable identifiers, please use the name field of a job. Hybrik will not verify this identifier for uniqueness.
Length: 0..192

Update Job

//Modify Job #12345
hybrik_api.call_api('PUT', '/jobs/12345', null, {
  name: "My new name", 
  priority: 250, 
  expiration: 1440 
})
$ curl -u OAPI_KEY:OAPI_SECRET -X PUT https://api-demo.hybrik.com/v1/jobs/$JOB_ID \
  -d '{
  "name": "Fifth Element, Web Streaming",
  "user_tag": "myjob_012345",
  "priority": 100,
  "expiration": 1440
}' \
  -H "Content-Type: application/json" \
  -H "X-Hybrik-Sapiauth: api_auth_token" \
  -H "X-Hybrik-Compliance: YYYYMMDD"

Response Object

{
  "id": "12345"
}

Modifies an existing job.

HTTP Request

PUT /jobs/{job_id}

Optional Parameters

Name Type Description
expiration integer The expiration (in minutes) of the job. A completed job will expire and be deleted after [expiration] minutes. Default is 30 days.
default: 43200
Range: value <= 259200
name string The visible name of a job.
priority integer The job priority (1: lowest, 254: highest).
default: 100
Range: 1 <= value <= 254
user_tag nullable string An optional, free-form, persistent identifier for this job. The intent is to provide a machine-trackable, user-specified identifier. For human-readable identifiers, please use the name field of a job. Hybrik will not verify this identifier for uniqueness.
Length: 0..192

Stop Job

//Stop Job #12345
hybrik_api.call_api('PUT', '/jobs/12345/stop')

$ curl -u OAPI_KEY:OAPI_SECRET -X PUT https://api-demo.hybrik.com/v1/jobs/$JOB_ID/stop \
  -H "Content-Type: application/json" \
  -H "X-Hybrik-Sapiauth: api_auth_token" \
  -H "X-Hybrik-Compliance: YYYYMMDD"

Response Object

{
  "id": "12345"
}

Stops a job.

HTTP Request

PUT /jobs/{job_id}/stop

Delete Job

//Delete Job #12345
hybrik_api.call_api('DELETE', '/jobs/12345')

$ curl -u OAPI_KEY:OAPI_SECRET -X DELETE https://api-demo.hybrik.com/v1/jobs/$JOB_ID \
  -H "Content-Type: application/json" \
  -H "X-Hybrik-Sapiauth: api_auth_token" \
  -H "X-Hybrik-Compliance: YYYYMMDD"

Response Object

{
  "id": "12345"
}

Delete an existing job.

HTTP Request

DELETE /jobs/{job_id}

Delete Jobs

//Delete Jobs 12345, 56789, and 34567
hybrik_api.call_api('DELETE', '/jobs', null, {
  ids: [12345, 56789, 34567]
})
$ curl -u OAPI_KEY:OAPI_SECRET -X DELETE https://api-demo.hybrik.com/v1/jobs \
  -d '{
  "ids": [
    "12344346",
    "12344347"
  ]
}' \
  -H "Content-Type: application/json" \
  -H "X-Hybrik-Sapiauth: api_auth_token" \
  -H "X-Hybrik-Compliance: YYYYMMDD"

Response Object

{
  "items": [
       "12345", 
       "56789", 
       "34567"
  ]
}

Delete a set of existing jobs.

HTTP Request

DELETE /jobs

Required Parameters

Name Type Description
ids array An array of the job numbers to be deleted. Example: ["12345", "12346"]

Get Job Definition

//Get job definition for Job #12345
hybrik_api.call_api('GET', '/jobs/12345/definition')

$ curl -u OAPI_KEY:OAPI_SECRET https://api-demo.hybrik.com/v1/jobs/$JOB_ID/definition \
  -H "X-Hybrik-Sapiauth: api_auth_token" \
  -H "X-Hybrik-Compliance: YYYYMMDD"

Response Object

{
  "name": "My Job",
  "priority": 100,
  "payload": {
    "elements": [
      {
        "uid": "source_file",
        "kind": "source",
        "payload": {
          "kind": "asset_url",
          "payload": {
            "storage_provider": "s3",
            "url": "s3://my_bucket/my_folder/my_file.mp4"
          }
        }
      },
      {
        "uid": "transcode_task",
        "kind": "transcode",
        "task": {
          "retry_method": "fail",
          "tags": [
          ]
        },
        "payload": {
          // transcode payload
        }
      }
    ],
    "connections": [
      {
        "from": [
          {
            "element": "source_file"
          }
        ],
        "to": {
          "success": [
            {
              "element": "transcode_task"
            }
          ]
        }
      }
    ]
  }
}

Get the job structure in the Hybrik schema notation.

HTTP Request

GET /jobs/{job_id}/definition

Get Job Result

//Get job result for Job #12345
hybrik_api.call_api('GET', '/jobs/12345/result')

$ curl -u OAPI_KEY:OAPI_SECRET https://api-demo.hybrik.com/v1/jobs/$JOB_ID/result \
  -H "X-Hybrik-Sapiauth: api_auth_token" \
  -H "X-Hybrik-Compliance: YYYYMMDD"

Response Object

{
    "errors": [],
    "job": {
        "id": 476993,
        "is_api_job": 1,
        "priority": 100,
        "creation_time": "2016-11-11T23:30:51.000Z",
        "expiration_time": "2016-12-11T23:30:52.000Z",
        "user_tag": null,
        "status": "completed",
        "render_status": "completed",
        "task_count": 1,
        "progress": 100,
        "name":  "Transcode Job: s3:/my_bucket/my_folder/my_file.mp4",
        "first_started": "2016-11-11T23:31:21.000Z",
        "last_completed": "2016-11-11T23:31:29.000Z"
    },
    "tasks": [
        {
            "id": 2196469,
            "priority": 100,
            "name": "Transcode Job: s3:/my_bucket/my_folder/my_file.mp4",
            "retry_count": -1,
            "status": "completed",
            "assigned": "2016-11-11T23:30:51.000Z",
            "completed": "2016-11-11T23:30:51.000Z",

        /* information about each task here */

Get the job result after completion or failure.

HTTP Request

GET /jobs/{job_id}/result

Get Job Info

//Get job info for Job #12345
hybrik_api.call_api('GET', '/jobs/12345/info',  { 
  fields: [ 
    "id", 
    "name",
    "progress",
    "status",
    "start_time",
    "end_time"
  ]
})

$ curl -u OAPI_KEY:OAPI_SECRET https://api-demo.hybrik.com/v1/jobs/$JOB_ID/info \
 -G \
  -d fields[]=id \
  -d fields[]=progress \
  -H "X-Hybrik-Sapiauth: api_auth_token" \
  -H "X-Hybrik-Compliance: YYYYMMDD"

Response Object

{
  "id": "12345",
  "name": "Fifth Element, Web Streaming",
  "progress": 42,
  "status": "running",
  "start_time": "2016-01-01T12:00:00Z",
  "end_time": "2016-01-01T12:10:30Z"
}

Get information about a specific job.

HTTP Request

GET /jobs/{job_id}/info

Optional Parameters

Name Type Description
fields array Specify which fields to include in the response. Possible values are: id, name, user_tag, priority, status, substatus, progress, creation_time, start_time, end_time, expiration_time, and error. Example: ["id", "progress"]
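
A minimal sketch of polling this endpoint until a job finishes (the two-second interval and the terminal status values used here are assumptions, not API guarantees):

// poll /jobs/{job_id}/info until the job reaches a terminal status
function wait_for_job(job_id) {
    return hybrik_api.call_api('GET', '/jobs/' + job_id + '/info',
        { fields: ['id', 'status', 'progress'] })
        .then(function (info) {
            console.log('Job ' + info.id + ': ' + info.status + ' (' + info.progress + '%)');
            if (info.status === 'completed' || info.status === 'failed') {
                return info;
            }
            // not finished yet -- check again in a couple of seconds
            return new Promise(function (resolve) {
                setTimeout(function () { resolve(wait_for_job(job_id)); }, 2000);
            });
        });
}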

List Jobs

//Get a list of jobs from #12345 to #12347
hybrik_api.call_api('GET', '/jobs/info',  { 
  ids: [
    12345,
    12346, 
    12347
  ],  
  fields: [ 
    "id", 
    "name",
    "progress", 
    "status",
    "start_time",
    "end_time"
  ]
})

$ curl -u OAPI_KEY:OAPI_SECRET https://api-demo.hybrik.com/v1/jobs/info \
 -G \
  -d ids[]=12344346 \
  -d ids[]=12344347 \
  -d filters[]=%7B%22field%22%3D%3E%22status%22%2C+%22values%22%3D%3E%5B%22running%22%5D%7D \
  -d fields[]=id \
  -d fields[]=progress \
  -d sort_field=id \
  -d order=asc \
  -d skip=1000 \
  -d take=100 \
  -H "X-Hybrik-Sapiauth: api_auth_token" \
  -H "X-Hybrik-Compliance: YYYYMMDD"

Response Object

{
    "items": [
        {
            "id": "12345",
            "name": "My first job",
            "status": "completed",
            "start_time": "2016-11-11T23:30:37.000Z",
            "end_time": "2016-11-11T23:30:45.000Z",
            "progress": 100
        },
        {
            "id": "12346",
            "name": "My second job",
            "status": "completed",
            "start_time": "2016-11-11T23:30:37.000Z",
            "end_time": "2016-11-11T23:30:40.000Z",
            "progress": 100
        },
        {
            "id": "12347",
            "name": "My third job",
            "status": "running",
            "start_time": "2016-11-11T23:31:21.000Z",
            "end_time": "2016-11-11T23:31:29.000Z",
            "progress": 62
        }
    ]
}

Provide a filtered, ordered list of job information records.

HTTP Request

GET /jobs/info

Optional Parameters

Name Type Description
ids array A list of job ids, filtering the records to be returned.
default: []
fields array An array specifying which fields to include in the response. Possible values are: id, name, user_tag, priority, status, substatus, progress, start_time, creation_time, end_time, expiration_time, error
default: ["id"]
filters/field string The type of field to filter on.
one of:"name" or "user_tag" or "status" or "substatus"
filters/values array The job filter match values.
order string The sort order of returned jobs
default: "asc"
one of:"asc" or "desc"
skip integer Specify the number of records to omit from the beginning of the result list.
default: 0
sort_field string The sort field of returned jobs.
default: "id"
one of:"id" or "name" or "user_tag" or "priority" or "status" or "substatus" or "progress" or "creation_time" or "end_time" or "start_time" or "expiration_time"
take integer Specify the number of records to retrieve.
default: 100
Range: value <= 1000
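
As a sketch, the same kind of filtered listing expressed through the connector module (how the connector encodes nested objects onto the query string is not specified here, so treat this as illustrative):

// list running jobs, highest id first, 100 at a time
hybrik_api.call_api('GET', '/jobs/info', {
    filters: [
        { field: 'status', values: ['running'] }
    ],
    fields: ['id', 'name', 'progress', 'status'],
    sort_field: 'id',
    order: 'desc',
    take: 100
});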

Count Jobs

//Get a count of all the jobs ("queued", "active", "completed", "failed")
hybrik_api.call_api('GET', '/jobs/count')

$ curl -u OAPI_KEY:OAPI_SECRET https://api-demo.hybrik.com/v1/jobs/count \
 -G \
  -d filters[0][field]=status \
  -d filters[0][values][]=active \
  -H "X-Hybrik-Sapiauth: xxx" \
  -H "X-Hybrik-Compliance: 20190101"

Response Object

{
  "count": 107
}

Provides the count of jobs of the specified type(s).

HTTP Request

GET /jobs/count

Optional Parameters

Name Type Description
filters/field string The type of field to filter on.
one of:"name" or "user_tag" or "status" or "substatus"
filters/values array Filter match values.

Job Task List

//Get a list of the tasks associated with job #12345
hybrik_api.call_api('GET', '/jobs/12345/tasks',  { 
  fields: [
    "id", 
    "name", 
    "status",
    "progress", 
    "machine_uid"
  ]
})
$ curl -u OAPI_KEY:OAPI_SECRET https://api-demo.hybrik.com/v1/jobs/$JOB_ID/tasks \
 -G \
  -d fields[]=id \
  -d fields[]=progress \
  -H "X-Hybrik-Sapiauth: api_auth_token"

Response Object

{
    "items": [
        {
            "id": "2196469",
            "name": "API Job Trigger - My Job",
            "status": "completed",
            "progress": 100
        },
        {
            "id": "2196470",
            "name": "Transcode - My Job",
            "status": "completed",
            "progress": 100,
            "machine_uid": "i-06a65147792f66a44"
        },
        {
            "id": "2196471",
            "name": "Analyze - My Job",
            "status": "running",
            "progress": 67,
            "machine_uid": "i-06a4a678234e46"
        },
        {
            "id": "2196472",
            "name": "QC - My Job",
            "status": "queued",
            "progress": 0,
            "machine_uid": "i-34556a743357e53"
        }
    ]
}

Provide a list of job tasks.

HTTP Request

GET /jobs/{job_id}/tasks

Optional Parameters

Name Type Description
fields array Specify which fields to include in the response. Possible values are: id, name, kind, status, progress, creation_time, start_time, end_time, machine_uid, and error.
default: ["id"]

Job JSON

Overview

A Hybrik Job is composed of Elements and Connections: the Elements define the individual operations (source, transcode, analyze, QC, notify, etc.), and the Connections define how those Elements are linked together.

When Hybrik is processing a Job, it breaks it down into Tasks. Generally, a Task has a one-to-one relationship with a Job Element. There are some Elements, however, that can be broken into smaller Tasks. For example, a transcode can be broken into many Tasks, where each Task is rendering a small section of the overall transcode.

This reference contains descriptions of all of the JSON objects that can be used in Hybrik. Whenever an object contains other objects, there is a hyperlink to the sub-object.

Jobs

Job JSON Example

{
  "name": "Hybrik API Example#1",
  "priority": 100,
  "payload": {
    "elements": [
      {
        "uid": "source_file",
        "kind": "source",
        "payload": {
          "kind": "asset_url",
          "payload": {
            "storage_provider": "s3",
            "url": "s3://hybrik-examples/public/sources/sample1.mp4"
          }
        }
      },
      {
        "uid": "transcode_task",
        "kind": "transcode",
        "payload": {
          "location": {
            "storage_provider": "s3",
            "path": "s3://hybrik-examples/public/output/transcode/example1"
          },
          "targets": [
            {
              "file_pattern": "{source_basename}.mp4",
              "existing_files": "replace",
              "container": {
                "kind": "mp4"
              },
              "video": {
                "codec": "h264",
                "width": 640,
                "height": 360,
                "frame_rate": 23.976,
                "bitrate_kb": 600
              },
              "audio": [
                {
                  "codec": "heaac_v2",
                  "channels": 2,
                  "sample_rate": 44100,
                  "bitrate_kb": 128
                }
              ]
            }
          ]
        }
      }
    ],
    "connections": [
      {
        "from": [
          {
            "element": "source_file"
          }
        ],
        "to": {
          "success": [
            {
              "element": "transcode_task"
            }
          ]
        }
      }
    ]
  }
}

Name Type Description
name string A name for the job. This will be displayed in the Job window. It does not have to be unique, but it helps to find jobs when they are given searchable names.
payload object The job payload contains all of the structural information about the job. The payload consists of an elements array and a connections array. The elements array defines the various job tasks and the connections array defines how these elements are connected.
schema string Optional. Hybrik will be supporting some third-party job schemas, which can be specified in this string. The default is "hybrik".
priority integer Optional. The priority of a job (1 = lowest, 254 = highest)
default: 100
range: 1 <= value <= 254
user_tag string Optional. The purpose of the user_tag is to provide a machine-trackable, user-specified, identifier. For a human readable identifier, please use the name field of a job. Hybrik will not verify the uniqueness of this identifier.
length: 0..192
expiration integer Optional. Expiration (in minutes) of the job. A completed job will expire and be deleted after [expiration] minutes. Default is 30 days.
default: 43200
range: value <= 259200
task_retry object Optional. Object defining the default task retry behavior of all tasks in this job. Task retry determines the maximum number of times to retry a task as well as the delay between each attempt.
task_tags array Optional. This array contains the task_tags that all the tasks of this job will have. When a task goes to be executed, it will only be executed on machine nodes that have a matching task_tag. For example, if a task is tagged with the tag "high_performance" then it will only run on machines that are also tagged "high_performance"
definitions object Optional. Global string replacements can be defined in this section. Anything in the Job JSON that is enclosed in double curly braces, such as {{to_be_replaced}}, will be replaced.

Job Payload

Example Job Payload Object

{
  "name": "My Job Name",
  "payload": {
    "elements": [
      {
        "uid": "source_file",
        "kind": "source",
        "payload":{
            /* source payload */       
        }
      },
      {
        "uid": "transcode_task",
        "kind": "transcode",
        "task": {
          "retry_method": "fail"
        },
        "payload": {
          /* transcode task payload */
          }
      }
    ],
    "connections": [
      {
        "from": [
          {
            "element": "source_file"
          }
        ],
        "to": {
          "success": [
            {
              "element": "transcode_task"
            }
          ]
        }
      }
    ]
  }
}
Name Type Description
elements array An array defining the various job elements. Each element object has a uid that uniquely identifies it, a kind that specifies the type of object, a task object that defines the generic task behavior, and a payload.
connections array An array defining how the various task elements are connected.

Definitions

Example Job Definitions Object

{
  "definitions": {
    "source": "s3://my_bucket/my_folder/my_file.mp4",
    "destination": "s3://my_bucket/output_folder",
    "video_basics": {
      "codec": "h264",
      "profile": "high",
      "level": "3.0",
      "frame_rate": "24000/1001"
    },
    "audio_basics": {
      "codec": "aac_lc",
      "sample_rate": 48000,
      "bitrate_kb": 128
    }
  },
  "name": "My job: {{source}}",
  "payload": {
    "elements": [
      {
        "uid": "source_file",
        "kind": "source",
        "payload": {
          "kind": "asset_url",
          "payload": {
            "storage_provider": "s3",
            "url": "{{source}}"
          }
        }
      },
      {
        "uid": "transcode_task",
        "kind": "transcode",
        "payload": {
          "location": {
            "storage_provider": "s3",
            "path": "destination"
          },
          "targets": [
            {
              "file_pattern": "{source_basename}_1200kbps.mp4",
              "existing_files": "replace",
              "container": {
                "kind": "mp4"
              },
              "video": {
                "$inherits": "video_basics",
                "width": 1280,
                "height": 720,
                "bitrate_kb": 1200
              },
              "audio": [
                {
                  "$inherits": "audio_basics",
                  "channels": 2
                }
              ]
            }
          ]
        }
      }
    ],
    "connections": [
      {
        "from": [
          {
            "element": "source_file"
          }
        ],
        "to": {
          "success": [
            {
              "element": "transcode_task"
            }
          ]
        }
      }
    ]
  }
}

The Definitions object allows you to create global string replacements in the Job JSON. This can be useful for simplifying editing or parameter replacements in the JSON. If a string such as "my_string" is defined in the Definitions section, then Hybrik will replace every occurrence of {{my_string}} in the rest of the Job JSON with the value of "my_string" in the Definitions section. Using the Job Definition example, every occurrence of {{destination}} in the Job JSON would be replaced with the path defined at the top. If you need to insert the contents of an object in the definitions into an object, use the "$inherits" label. This can be particularly helpful when dealing with multi-layer outputs, since a setting can be changed in one location and affect all of the output layers. When submitting jobs via the API, you should consider putting all of the parameters that will be replaced by the automation system into the definition section. This makes it instantly visible which parameters need to be replaced and makes swapping out parameters simple. It is much easier to reference definitions.bitrate than it is to reference elements[1].transcode.payload.targets[0].video.bitrate_kb!
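
As a sketch of that pattern (the template file name and the replacement values below are hypothetical), an automation layer only ever touches the definitions block of a job template before submission:

// load a job template and swap out only the definitions (hypothetical file and values)
var fs = require('fs');
var job = JSON.parse(fs.readFileSync('job_template.json', 'utf8'));
job.definitions.source = 's3://my_bucket/incoming/new_file.mp4';
job.definitions.destination = 's3://my_bucket/output/new_file';
// the elements, connections, and targets in the template remain untouched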

Elements Array

The contents of the Elements Array are the components of your workflow. The available elements include: Source, Transcode, Analyze, QC, Notify, Copy, Package, Folder Enum, Watchfolder, DPP Package, BIF Creator, and Script. You can have multiple versions of the same type of element. For example, you could have 2 copies of the Analyze and QC tasks -- one for checking your incoming data and one for checking your outgoing results. These could be checking completely different parameters. Each Element has a UID that uniquely identifies it in the workflow.

Example Elements Array

{
  "elements": [
    {
      "uid": "source_file",
      "kind": "source",
      "payload": {
        "kind": "asset_url",
        "payload": {
          "storage_provider": "s3",
          "url": "s3://hybrik-examples/public/sources/sample1.mp4"
        }
      }
    },
    {
      "uid": "transcode_task",
      "kind": "transcode",
      "payload": {
        "location": {
          "storage_provider": "s3",
          "path": "s3://hybrik-examples/public/output/transcode/example1"
        },
        "targets": [
          {
            "file_pattern": "{source_basename}.mp4",
            "existing_files": "replace",
            "container": {
              "kind": "mp4"
            },
            "video": {
              "codec": "h264",
              "width": 640,
              "height": 360,
              "frame_rate": 23.976,
              "bitrate_kb": 600
            },
            "audio": [
              {
                "codec": "heaac_v2",
                "channels": 2,
                "sample_rate": 44100,
                "bitrate_kb": 128
              }
            ]
          }
        ]
      }
    }
  ]
}
Name Type Description
uid string A unique ID for the task element. An example would be "source_file" or "transcode_task_1". This UID allows the task to be uniquely referenced by other parts of the job.
kind enum              
source
transcode
copy
analyze
qc
notify
package
folder_enum
watchfolder
dpp_packager
bif_creator
script
The type of task element.
task object An object describing the generic task behavior, such as priority and number of retries.
payload object The payload describes the parameters of the specific element. Only one type of payload is allowed. The options are:
source
transcode
copy
analyze
qc
notify
package
folder_enum
watchfolder
dpp_packager
bif_creator
script

Connections Array

The Connections Array tells Hybrik how to "connect" the Elements in your workflow. Some items will execute in series, while others can execute in parallel. Some tasks will execute when the previous task completes successfully, and some you will only want to execute when the previous task fails. This sequencing and flow control are managed by the Connections Array. The Elements are referred to by their UID.

Example Connections Array

{
  "from": [
    {
      "element": "transcode_task"
    }
  ],
  "to": {
    "success": [
      {
        "element": "copy_task"
      }
    ],
    "error": [
      {
        "element": "error_notify"
      }
    ]
  }
}
Name Type Description
from array An array that lists each Element that is being connected to this item.
to object An object that defines where this Element connects to. It contains "success" and "error" arrays; only one of these sets of connections will be triggered upon completion.

Task Object

Example Task Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "task": {
    "retry_method": "fail",
    "name": "Test Transcode For Distribution",
    "priority": 200
  },
  "payload": {
    /* transcode payload goes here */
  }
}
Name Type Description
name string Optional. A name for the task. This will be displayed in the Task window. It does not have to be unique, but it helps to search for specific tasks when they are given unique names. If left blank, Hybrik will automatically generate a task name based on the job name.
tags array Optional. A list of job/task tags. Tags are custom strings that are used to match jobs and tasks to specific computing groups.
retry_method enum             
fail
retry
Optional. A task can be retried automatically. If this is set to "retry", then the retry object must be defined.
default: fail
retry object Defines how many times a retry should be attempted and how many seconds to wait between each attempt.
priority integer Optional. If undefined, all tasks take on the priority of the parent job. The priority of a task (1 = lowest, 254 = highest)
comment string Optional. The user-defined comment about a task. This is only accessible via the API.
extended object Optional. The extended properties for a task.

Retry Object

Example Retry Object

{
  "task": {
    "retry_method": "retry",
    "retry": {
      "count": 2,
      "delay_sec": 30
    }
  }
}

Name Type Description
count integer Maximum number of retries.
maximum: 5
delay_sec integer Optional. Number of seconds to wait after a failure until a retry is attempted.
maximum: 3600
default: 45

Source Element

Source

The Source object defines the source of your workflow. In its simplest form, the Source object points to a single file. But your Source can actually be much more complex. For example, your source may consist of multiple components -- video, audio tracks, and subtitle files. Or your source may actually be multiple sources all being stitched into a final output. Or you may have a source file that has 6 discrete mono audio tracks that you need to treat like 6 channels of a single surround track. The Source object lets you handle all of these scenarios: it defines what files to use, where to find them, how to access them, and how to assemble them.

Example Source Object

{
  "uid": "my_source",
  "kind": "asset_url",
  "payload": {
    "storage_provider": "s3",
    "url": "s3://hybrik-examples/public/sources/sample1.mp4"
  }
}
Name Type Description
uid string A unique identifier for this element.
kind enum                   
asset_url
asset_complex
The type of source file. An "asset_url" is a single element, whereas an "asset_complex" is an asset made up of multiple elements.
payload object The payload for the particular source type. The payload types are:
asset_url
asset_complex

Asset URL

Example Asset URL Payload Object

{
  "uid": "my_source",
  "kind": "asset_url",
  "payload": {
    "storage_provider": "s3",
    "url": "s3://hybrik-examples/public/sources/sample1.mp4"
  }
}

Name Type Description
storage_provider enum             
s3
gs
ftp
sftp
http
swift
swiftstack
akamains
relative
The type of file access.
url string The complete URL for the file location.
access anyOf         
s3
gs
ftp
sftp
http
swift
swiftstack
akamains
This contains credentials granting access to the location.

Asset Complex

Example Asset_Complex Object

{
  "uid": "my_complex_source",
  "kind": "source",
  "payload": {
    "kind": "asset_complex",
    "payload": {
      "kind": "sequence",
      "location": {
        "storage_provider": "s3",
        "path": "s3://my_bucket/my_folder"
      },
      "asset_versions": [
        {
          "version_uid": "intro",
          "asset_components": [
            {
              "kind": "name",
              "component_uid": "file1",
              "name": "intro_video.mp4",
              "trim": {
                "inpoint_sec": 0,
                "duration_sec": 10
              }
            }
          ]
        },
        {
          "version_uid": "main",
          "asset_components": [
            {
              "kind": "name",
              "component_uid": "file2",
              "name": "main_video.mov"
            }
          ]
        }
      ]
    }
  }
}

Asset_complex is used for defining multi-part sources. These can include multiple video, audio, and subtitle files. In the JSON, you will see two arrays, one called "asset_versions" and one called "asset_components". These exist so that you can create much more complex operations. Each element of the "asset_versions" array specifies one source asset that will be used. Because a source may not actually be a single file but rather a collection of video tracks, audio tracks, and subtitle tracks, each element of the "asset_versions" array contains an "asset_components" array. The "asset_components" array is where you would specify the various components.

You will also notice that each element in the "asset_components" array has a "kind" value. In our previous example, we used "name" as the value for "kind". Other values include "list", "template", "image_sequence", "binary_sequence", and "asset_sequence". These are used when you have sources of different types. For example, if you had an asset that consisted of thousands of .png files, called "animation0001.png", "animation0002.png", etc. that you wanted to transcode into a single output, you could specify it as shown in the Example Still Image Source Object below.

Example Still Image Source Object

{
  "uid": "source_file",
  "kind": "source",
  "payload": {
    "kind": "asset_complex",
    "payload": {
      "asset_versions": [
        {
          "location": {
            "storage_provider": "s3",
            "path": "s3://my_bucket/my_folder"
          },
          "asset_components": [
            {
              "kind": "image_sequence",
              "image_sequence": {
                "base": "animation%04d.png"
              }
            }
          ]
        }
      ]
    }
  }
}

Transcode Task

Transcode

Example Transcode Object

{
  "name": "Hybrik Transcode Example",
  "payload": {
    "elements": [
      {
        "uid": "source_file",
        "kind": "source",
        "payload": {
          "kind": "asset_url",
          "payload": {
            "storage_provider": "s3",
            "url": "s3://my_bucket/my_input_folder/my_file.mp4"
          }
        }
      },
      {
        "uid": "transcode_task",
        "kind": "transcode",
        "payload": {
          "location": {
            "storage_provider": "s3",
            "path": "s3://my_bucket/my_output_folder"
          },
          "targets": [
            {
              "file_pattern": "{source_basename}_converted.mp4",
              "existing_files": "replace",
              "container": {
                "kind": "mp4"
              },
              "video": {
                "width": 1280,
                "height": 720,
                "codec": "h264",
                "profile": "high",
                "level": "4.0",
                "frame_rate": 23.976
              },
              "audio": [
                {
                  "codec": "aac",
                  "channels": 2,
                  "sample_rate": 48000,
                  "sample_size": 16,
                  "bitrate_kb": 128
                }
              ]
            }
          ]
        }
      }
    ],
    "connections": [
      {
        "from": [
          {
            "element": "source_file"
          }
        ],
        "to": {
          "success": [
            {
              "element": "transcode_task"
            }
          ]
        }
      }
    ]
  }
}

The Transcode Task allows you to specify the type(s) of transcode that you would like to perform in a job. A single Transcode Task can specify more than one output Target. For example, a single Transcode Task can create all 10 layers of an HLS package in a single task. All of the transcode targets will be processed at the same time – meaning there will be one decode pipeline feeding all of the outputs. Therefore if you have 10 output targets, the fastest and the slowest will both complete at the same time, since they are being fed by the same decode pipeline. You can tell Hybrik to break up a Transcode Task into separate tasks to be run on multiple machines.

transcode

Name Type Description
location object The base target location. Locations specified in deeper branches of this JSON will override this.
options object Options for this transcode. Includes source delete option and source pre-fetch.
source_pipeline object The source_pipeline sets filtering or segmenting prior to executing the transcode.
watermarking array An array of objects defining the types of watermarks to be included in the output.
support_files array Support files referenced inside of container/video/audio, for example in x265_options.
targets array An array of target outputs. Each target specifies a location, container, video, and audio properties.

Options

Example Options Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_output_folder"
    },
    "options": {
      "delete_sources": true
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
        },
        "video": {
        },
        "audio": [
          {
          }
        ]
      }
    ]
  }
}

transcode.options

Name Type Description
delete_sources boolean
Set to delete the task's source files on successful execution.
result object Options to modify how results are returned.

Result

transcode.options.result

Example Result Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_output_folder"
    },
    "options": {
      "delete_sources": true,
      "result": {
        "mov_atom_descriptor_style": "full"
      }
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
        },
        "video": {
        },
        "audio": [
          {
          }
        ]
      }
    ]
  }
}
Name Type Description
mov_atom_descriptor_style enum                          
none
condensed
by_track
full
"none": do not list atoms. "condensed": pick the most important atoms and list linearly with the belonging tracks. "by_track": show the full hierarchy but list along with tracks. "full": show the full file hierarchy in the asset element.

Source_pipeline

Example Source_pipeline Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "source_pipeline": {
      "options": {
        "force_ffr": true,
        "max_decode_errors": 10
      },
      "segmented_rendering": {
        "duration_sec": 180
      }
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_output.mp4",
        "existing_files": "replace",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "width": 1280,
          "height": 720,
          "codec": "h264"
        },
        "audio": [
          {
            "codec": "aac",
            "channels": 2,
            "bitrate_kb": 128
          }
        ]
      }
    ]
  }
}

The source_pipeline allows the modification of the source prior to beginning the transcode. An example use case would be where you had 10 output targets, and you wanted each of them to have a logo imprinted in the same relative location. You could apply the imprint filter on the source prior to the creation of the outputs. Another example would be splitting the processing of an output across many machines. By specifying segmented rendering in the source_pipeline, different segments of the source will be sent to different target machines.

transcode.source_pipeline

Name Type Description
trim anyOf         
by_sec_in_out
by_sec_in_dur
by_timecode
by_asset_timecode
by_frame_nr
by_section_nr
by_media_track
by_nothing
Object defining the type of trim operation to perform on an asset.
ffmpeg_source_args string
The FFmpeg source string to be applied to the source file. Use {source_url} within this string to insert the source file name(s).
options object Options to be used during the decoding of the source.
accelerated_prores boolean
Use accelerated Apple ProRes decoder.
segmented_rendering object Segmented rendering parameters.
manifest_decode_strategy enum                          
simple
reject_complex
reject_master_playlist
Defines the level of complexity allowed when using a manifest as a source.
chroma_dither_algorithm enum                          
none
bayer
ed
a_dither
x_dither
The dithering algorithm to use for color conversions.
scaler object The type of function to be used in scaling operations.

Options

transcode.source_pipeline.options

Example Options Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "source_pipeline": {
      "options": {
        "force_ffr": true,
        "max_decode_errors": 100,
        "max_sequential_decode_errors": 10,
        "resolve_manifest": true
      },
      "segmented_rendering": {
        "duration_sec": 180
      }
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_output.mp4",
        "container": {
        },
        "video": {
        },
        "audio": [
          {
          }
        ]
      }
    ]
  }
}
Name Type Description
force_ffr boolean
Force Fixed Frame Rate - even if the source file is detected as a variable frame rate source, treat it as a fixed framerate source.
wait_for_source_timeout_sec number Set the maximum time for waiting to access the source data. This can be used to handle data that is in transit.
max_decode_errors integer The maximum number of decode errors to allow. Normally, decode errors cause job failure, but there can be situations where a more flexible approach is desired.
max_sequential_decode_errors integer The maximum number of sequential errors to allow during decode. This can be used in combination with max_decode_errors to set bounds on allowable errors in the source.
no_rewind boolean Certain files may generate A/V sync issues when rewinding, for example after a pre-analysis. This will enforce a reset instead of rewinding.
no_seek boolean Certain files should never be seeked because of potentially occurring precision issues.
low_latency boolean Allows files to be loaded in low latency mode, meaning that there will be no analysis at startup.
cache_ttl integer If a render node is allowed to cache this file, this will set the Time To Live (ttl). If not set (or set to 0) the file will not be cached but re-obtained whenever required.
index_location object Specify a location for the media index file.
resolve_manifest boolean If this is set to true, the file is considered a manifest. The media files referred to in the manifest will be taken as the real source.
master_manifest object Master file to be used for manifest resolution (for example, IMF CPLs).
master_manifests array An array of master files to be used for manifest resolution (for example IMF CPLs).
ignore_errors array Attempt to ignore input errors of the specified types. Error type options include invalid_argument and non_monotonic_dts.
auto_offset_sources boolean If this is set to true, the source is considered starting with PTS 0 regardless of the actual PTS.

Segmented_rendering

transcode.source_pipeline.segmented_rendering

Example Segmented_rendering Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "source_pipeline": {
      "segmented_rendering": {
        "duration_sec": 180
      }
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_output.mp4",
        "existing_files": "replace",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "width": 1280,
          "height": 720,
          "codec": "h264"
        },
        "audio": [
          {
            "codec": "aac",
            "channels": 2,
            "bitrate_kb": 128
          }
        ]
      }
    ]
  }
}
Name Type Description
duration_sec number Duration (in seconds) of a segment in segment encode mode.
minimum: 1
pts_zero_base boolean Setting this to true will reset PTS stamps in the stream to a zero-based start.
scene_changes_search_duration_sec number Duration (in seconds) to look for a dominant previous or following scene change. Note that the segment duration can then be up to duration_sec + scene_changes_search_duration_sec long.
total_segments integer Total number of segments.
segment_nr integer Number of this segment.
strict_cfr boolean Combiner will merge and re-stripe transport streams
mux_offset_otb integer Timebase offset to be used by the muxer.

Scaler

transcode.source_pipeline.scaler

Example Scaler Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "source_pipeline": {
      "scaler": {
        "kind": "zscale",
        "config_string": "dither=error_diffusion",
        "apply_always": true
      }
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_output.mp4",
        "container": {
        },
        "video": {
        },
        "audio": [
          {
          }
        ]
      }
    ]
  }
}
Name Type Description
kind enum                          
default
zscale
The type of scaling to be applied.
default: default
config_string string The configuration string to be used with the specified scaling function.
apply_always boolean Always use the specified scaling function.

Watermarking

Example Watermarking Object

{
  "uid": "transcode_task_pass",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "watermarking": [
      {
        "kind": "nexguard_video",
        "payload": {
          "watermark_strength": "medium",
          "license_manager": {
            "ip": "192.168.21.100",
            "port": 5093
          },
          "warnings_as_errors": {
            "too_short": false
          }
        }
      }
    ],
    "targets": [
      {
        "file_pattern": "{source_basename}_watermarked.mp4",
        "existing_files": "replace",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "track_group_id": "V1",
          "codec": "h264",
          "width": 640,
          "height": 360,
          "frame_rate": 23.976,
          "bitrate_kb": 600
        },
        "audio": [
          {
            "track_group_id": "A1",
            "codec": "heaac_v2",
            "channels": 2,
            "sample_rate": 44100,
            "bitrate_kb": 128
          }
        ]
      }
    ]
  }
}

Hybrik supports both audio and video watermarking through technology integrations with third-party companies like Nielsen and Nagra Kudelski. These watermarks are invisible (or inaudible) and can only be detected by a separate process. While Hybrik provides the integration with these third-party components, you must be a customer of these third-parties to actually use them. Please contact Dolby for more information regarding these technologies.

transcode.watermarking

Name Type Description
kind enum                          
nexguard_video
Defines the watermark type.
default: nexguard_video
payload anyOf          The payload for the specific watermark.

Targets

Example Targets Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "my_first_output_2tracks.mp4",
        "existing_files": "replace",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "width": 1280,
          "height": 720,
          "codec": "h264",
          "profile": "high",
          "level": "4.0",
          "frame_rate": "24000/1001"
        },
        "audio": [
          {
            "codec": "aac",
            "channels": 1,
            "sample_rate": 48000,
            "sample_size": 16,
            "bitrate_kb": 96
          },
          {
            "codec": "aac",
            "channels": 1,
            "sample_rate": 48000,
            "sample_size": 16,
            "bitrate_kb": 96
          }
        ]
      },
      {
        "file_pattern": "my_second_output_6channels.mp4",
        "existing_files": "replace",
        "container": {
          "kind": "mov"
        },
        "video": {
          "width": 1920,
          "height": 1080,
          "codec": "h264",
          "profile": "high",
          "level": "4.0",
          "frame_rate": "30000/1001"
        },
        "audio": [
          {
            "codec": "aac_lc",
            "channels": 6,
            "sample_rate": 48000,
            "sample_size": 16,
            "bitrate_kb": 256
          }
        ]
      }
    ]
  }
}

Each output target is defined by an element in the "targets" array. Each target will use the same source, but can have completely different output parameters. If a particular parameter (e.g. "frame_rate") is not specified for a target, then the source's value for that parameter will be used.
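
For instance (a sketch only; the output values are illustrative), a target that omits frame_rate will simply keep the source's frame rate:

{
  "file_pattern": "{source_basename}_source_rate.mp4",
  "existing_files": "replace",
  "container": { "kind": "mp4" },
  "video": { "codec": "h264", "width": 1280, "height": 720, "bitrate_kb": 1200 },
  "audio": [ { "codec": "aac", "channels": 2, "bitrate_kb": 128 } ]
}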

transcode.targets

Name Type Description
uid string A UID (arbitrary string) to allow referencing this target. This UID may be used, for example, to specify a target for preview generation. User supplied, must be unique within an array of targets.
manifest_uids array An array of UIDs defining the manifests that this target belongs to. A target may belong to one or more manifests.
processing_group_ids array Allows target selection for subsequent tasks.
location object A location that overrides any location defined within the parents of this encode target.
file_pattern string
This describes the target file name. Placeholders such as {source_basename} for source file name are supported.
default: {source_basename}
trim anyOf         
by_sec_in_out
by_sec_in_dur
by_timecode
by_asset_timecode
by_frame_nr
by_section_nr
by_media_track
by_nothing
Object defining the type of trim operation to perform on an asset.
existing_files enum                          
delete_and_replace
replace
replace_late
rename_new
rename_org
fail
The desired behavior when a target file already exists. "replace": will delete the original file and write the new one. "rename_new": gives the new file a different auto-generated name. "rename_org": renames the original file. Note that renaming the original file may not be possible depending on the target location. "delete_and_replace": attempts to immediately delete the original file. This will allow for fast failure in the case of inadequate permissions. "replace_late": does not attempt to delete the original -- simply executes a write.
default: fail
force_local_target_file boolean
This will enforce the creation of a local file and bypass in-stream processing. Only used in scenarios where in-stream processing is impossible due to format issues.
include_if_source_has array This array allows for conditionally outputting tracks based on whether or not a specific input track exists. The tracks in the source are referred to by number reference: audio[0] refers to the first audio track.
include_conditions array Specifies conditions under which this output will be created. Can use JavaScript math.js notation.
size_estimate number
string
Setting a size_estimate can help in allocating the right amount of temporary local storage. Omit this value if it cannot be estimated to within +/- 5%.
ffmpeg_args string
The FFmpeg command line to be used. Any properties defined in the target section of this JSON will override FFmpeg arguments defined here.
ffmpeg_args_compliance enum                          
strict
relaxed
minimal
late
late_relaxed
late_minimal
Hybrik will interpret an FFmpeg command line. "relaxed" will also allow unknown or conflicting FFmpeg options to pass. The late_* options resolve this at render time and preserve the original ffmpeg_args in the JSON.
mp4box_args string
The post processing mp4box command line to be used. If omitted, mp4box will not be invoked.
hybrik_encoder_args string
The Hybrik encoder arguments line to be used. May overwrite default arguments.
nr_of_passes integer
string
This specifies how many passes the encode will use.
minimum: 1
maximum: 2
default: 1
slow_first_pass boolean
h264/h265: enables a slow (more precise) first pass.
compliance enum                          
xdcam_imx
xdcam_hd
xdcam_hd_422
xdcam_proxy
xdcam_dvcam25
avcintra
dvb
atsc
xavc
This setting can be used to force output compliance to a particular standard, for example XDCAM-HD.
compliance_enforcement enum                          
strict
relaxed
Defines whether the output compliance will be strict or relaxed. Relaxed settings allow parameters to be overridden in the JSON.
container object The transcoding container parameters.
video object The video parameters for the target output.
audio array Array defining the audio tracks in the output. Each element of the array is an object defining the track parameters, including codec, number of channels, bitrate, etc.
timecode array An array defining the timecode tracks in this output.
subtitle array Array defining the subtitle tracks in the output.
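
The table above covers more options than the example shows. The following is a minimal sketch of a single element of the "targets" array (not a complete job) combining a few of them: a user-supplied uid, the {source_basename} placeholder, a two-pass encode with a slow first pass, and "rename_new" handling of existing files. The uid and numeric values are placeholders.

{
  "uid": "target_720p_2pass",
  "file_pattern": "{source_basename}_2pass.mp4",
  "existing_files": "rename_new",
  "nr_of_passes": 2,
  "slow_first_pass": true,
  "container": {
    "kind": "mp4"
  },
  "video": {
    "codec": "h264",
    "height": 720,
    "bitrate_kb": 2000
  },
  "audio": [
    {
      "codec": "aac",
      "channels": 2,
      "sample_rate": 48000,
      "bitrate_kb": 128
    }
  ]
}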

Container

Example Container Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_output.{default_extension}",
        "existing_files": "replace",
        "container": {
          "kind": "mpeg2ts"
        },
        "video": {
          "codec": "h264",
          "bitrate_mode": "cbr",
          "bitrate_kb": 1000,
          "profile": "main",
          "level": "4.0",
          "height": 720
        },
        "audio": [
          {
            "codec": "aac",
            "channels": 2,
            "sample_rate": 48000,
            "sample_size": 16,
            "bitrate_kb": 128
          }
        ]
      }
    ]
  }
}

transcode.targets.container

Name Type Description
verify boolean Enable or disable post transcode verification for this track.
default: true
kind enum                          
copy
avi
hls
hls_subtitle
dash-mp4
dash-vod
dash-live
dash-segment
mp4
fmp4
segmented_mp4
mpegts
segmented_ts
mpeg2ts
mov
mxf
mxf_d10
webm
mkv
nut
ismv
3gp
mpeg2video
mp1_system
mp2_program
mp1_elementary
mp2_elementary
vob
dvd
aac
mp3
wav
aiff
alac
aa
flac
ogg
jpg
ass
srt
stl
ttml
imsc1
webvtt
elementary
dvbsub
scc
The container (i.e. multiplexing) format.
vendor enum                          
ap10
Certain formats (such as .mov and .mp4) can have a vendor string.
movflags string The FFmpeg movflags. See https://www.ffmpeg.org/ffmpeg-formats.html for more information.
muxrate_kb integer The multiplexer rate - only valid for MPEG transport streams. Omit to keep M2TS padding at a minimum.
mux_warnings enum                          
as_errors
ignore
Allows jobs to complete that would otherwise fail because of multiplexer warnings.
faststart boolean Enable progressive download for .mov and .mp4 files.
default: true
transport_id integer Set the TS Transport ID - only used for MPEG transport streams.
maximum: 8190
use_sdt boolean Switch the SDT (Service Description Table) on or off for MPEG transport streams.
pcr_pid integer Set the PCR PID - only used for MPEG transport streams.
maximum: 8190
pmt_pid integer Set the PMT PID - only used for MPEG transport streams.
maximum: 8190
pcr_interval_ms integer Set the PCR interval - only used for MPEG transport streams.
minimum: 20
maximum: 1000
pmt_interval_ms integer Set the PMT interval - only used for MPEG transport streams.
minimum: 20
maximum: 1000
pat_interval_ms integer Set the PAT interval - only used for MPEG transport streams.
minimum: 20
maximum: 1000
ts_offset_ms number Set the ts offset in the output file.
segment_duration_sec number
The segment duration in seconds for segmented or fragmented streams such as HLS or mp4/MPEG-DASH. Decimal notation (e.g. 5.5) is supported.
vframe_align_segment_duration boolean
Ensure segment duration is an integer multiple of the frame duration.
default: true
auto_speed_change_delta_percent number
If the frame rate delta is larger than this value, no speed change will be attempted. The default allows only the 29.97->30 and 23.97<->24 speed changes.
align_to_av_media boolean For subtitles only, align duration and segmenting to A/V media time.
hevc_codec_id_prefix string Overrides the codec identifier for HEVC.
references_location object The location of payload files for containers having external references.
encryption object DRM and encryption settings for the produced media files.
title string An optional title. Note that not all multiplexers support adding a title.
author string An optional author. Note that not all multiplexers support adding an author.
copyright string An optional copyright string. Note that not all multiplexers support adding a copyright string.
info_url string An optional info URL string. Note that not all multiplexers support adding a URL.
filters array An array defining the filters to be applied at the container level.
attributes array Container attributes. The specific meaning depends on container format. For dash-mp4 for example, these can be mpd xpath replacements.
enable_data_tracks boolean By default, data tracks (such as timecode) in MOV output are disabled/unchecked. This will enable all such tracks.
mov_atoms object Override container or all-track MOV atoms.
dvd_compatible boolean Enables constraints that enhance DVD compatibility. Applies to kind='mp2_program/dvd/vob' only.
brand string Sets the ftyp of an mp4/mov/3gp file. Example: '3gp5'.
compatible_brands array Appends to the compatible ftyp(s) of an mp4/mov/3gp file. Example: '["3gp5"]'.
forced_compatible_brands array Replaces the compatible ftyp(s) of an mp4/mov/3gp file. Example: '["3gp5"]'.
mainconcept_mux_profile enum                          
VCD
SVCD
DVD
DVD_MPEG1
DVD_DVR
DVD_DVR_MPEG1
DVD_PVR
DVD_PVR_MPEG1
DTV
DVB
MMV
DVHS
ATSC
ATSCHI
CABLELABS
ATSC_C
HDV_HD1
HDV_HD2
D10
D10_25
D10_30
D10_40
D10_50
HD_DVD
MainConcept multiplexer profile. See the MainConcept documentation for details.
mainconcept_mux_options string
Provide direct instruction to the MainConcept multiplexer. Values are constructed as "prop=val,prop=val". See MainConcept documentation for valid values.
scte35 oneOf         
scte35_in_source
scte35_in_sidecar
scte35_in_json
Settings to control the insertion of SCTE35 markers into the output.
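
For the MPEG transport stream settings above (PIDs, interval timing, and mux rate), the following is a hedged sketch of a single target; the PID numbers, intervals, and rates are illustrative only and must stay within the documented ranges.

{
  "file_pattern": "{source_basename}_mux.ts",
  "existing_files": "replace",
  "container": {
    "kind": "mpeg2ts",
    "muxrate_kb": 3500,
    "transport_id": 1,
    "pmt_pid": 480,
    "pcr_pid": 481,
    "pat_interval_ms": 100,
    "pmt_interval_ms": 100,
    "pcr_interval_ms": 40
  },
  "video": {
    "codec": "h264",
    "bitrate_mode": "cbr",
    "bitrate_kb": 2500,
    "height": 720
  },
  "audio": [
    {
      "codec": "aac",
      "channels": 2,
      "sample_rate": 48000,
      "bitrate_kb": 128
    }
  ]
}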

Encryption

transcode.targets.container.encryption

Example Encryption Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_output.{default_extension}",
        "existing_files": "replace",
        "container": {
          "kind": "mp4",
          "encryption": {
              "enabled": true,
            "schema": "mpeg-cenc",
            "drm": [
              "playready"
            ],
            "key_id": "[32 char hex sequence]",
            "key": "[32 char hex sequence]",
            "playready_pssh": "[base-64 encoded pssh]...=="
          }
        },
        "video": {
          "codec": "h264",
          "bitrate_mode": "vbr",
          "bitrate_kb": 1000,
          "max_bitrate_kb": 1200,
          "profile": "main",
          "level": "4.0",
          "height": 720
        },
        "audio": [
          {
            "codec": "aac",
            "channels": 2,
            "sample_rate": 48000,
            "sample_size": 16,
            "bitrate_kb": 128
          }
        ]
      }
    ]
  }
}
Name Type Description
enabled boolean Enable or disable encryption.
schema enum                          
aes-128-cbc
sample-aes
mpeg-cenc
mpeg-cbc1
mpeg-cens
mpeg-cbcs
none
The chosen encryption schema. Encryption keys will be generated by Hybrik.
default: aes-128-cbc
drm array An array specifying the types of DRM that will be used.
rotation integer
The encryption rotation interval. Every N file segments, a new encryption key will be generated.
default: 12
key_location object The optional key location. This will override any location defined within the parent of this task.
key_file_pattern string
This describes the key file name. Placeholders such as {source_basename} for source file name are supported.
key string The actual key, if pre-supplied.
iv string The initialization vector, if pre-supplied.
key_id string The Key ID. Used for MPEG-CENC only.
content_id string The Content ID. Used for MPEG-CENC only.
widevine_provider string The Widevine provider.
widevine_pssh string A Widevine PSSH string.
playready_url string The PlayReady licensing authority URL.
playready_pssh string A PlayReady PSSH string.
fairplay_uri string The FairPlay URI for the HLS URI attribute.
clearkey_pssh_version integer The PSSH box version for CENC.
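
As a complement to the MPEG-CENC example above, the following sketch shows segment-key encryption with aes-128-cbc and key rotation. This fragment assumes an HLS-style target can be expressed directly via a container of kind "hls"; the bucket path, key file pattern, segment duration, and rotation interval are placeholders, and the packaging settings of a real job may differ.

{
  "file_pattern": "{source_basename}.m3u8",
  "existing_files": "replace",
  "container": {
    "kind": "hls",
    "segment_duration_sec": 6,
    "encryption": {
      "enabled": true,
      "schema": "aes-128-cbc",
      "rotation": 10,
      "key_location": {
        "storage_provider": "s3",
        "path": "s3://my_bucket/my_folder/keys"
      },
      "key_file_pattern": "{source_basename}.key"
    }
  },
  "video": {
    "codec": "h264",
    "bitrate_kb": 2000,
    "height": 720
  },
  "audio": [
    {
      "codec": "aac",
      "channels": 2,
      "bitrate_kb": 128
    }
  ]
}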

Container Filters

Example Container Filters Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}.mp4",
       "container": {
          "kind": "mp4",
          "filters": [
            {
              "speed_change": {
                "factor": 1.1,
                "pitch_correction": true
              }
            }
          ]
        },
        "video": {
        },
        "audio": [
        ]
      }
    ]
  }
}

Container filters affect both the audio and video components of the output.

transcode.targets.container.filters

Name Type Description
kind enum                          
speed_change
fade
Specifies the type of container filter to be applied.
default: speed_change
include_conditions array Specifies conditions under which this filter will be applied. Can use JavaScript math.js notation.
payload anyOf         
speed_change
fade
The payload for the container filter.

Speed_change

transcode.targets.container.filters.speed_change

Example Speed_change Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}.mp4",
       "container": {
          "kind": "mp4",
          "filters": [
            {
              "speed_change": {
                "factor": 2.0,
                "pitch_correction": false
              }
            }
          ]
        },
        "video": {
        },
        "audio": [
        ]
      }
    ]
  }
}
Name Type Description
factor number
string
The speed change to be applied. The default is 1.0. Can use expressions such as (24000/1001)/25 (see the example following this table).
default: 1
pitch_correction boolean
Correct the audio pitch of the speed changed streams.
default: true
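
Because factor also accepts string expressions, a fragment like the following (container filters section only) applies the 23.976/25 ratio instead of a literal number, as is common for PAL frame rate conversions. Treat the value, and the direction of the resulting change, as illustrative only.

"filters": [
  {
    "speed_change": {
      "factor": "(24000/1001)/25",
      "pitch_correction": true
    }
  }
]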

Fade

transcode.targets.container.filters.fade

Example Fade Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}.mp4",
       "container": {
          "kind": "mp4",
          "filters": [
            {
              "fade": {
                "mode": "in",
                "start_sec": 0,
                "duration_sec": 2
              }
            }
          ]
        },
        "video": {
        },
        "audio": [
        ]
      }
    ]
  }
}
Name Type Description
mode enum                          
in
out
The fade mode, in or out.
default: in
start_sec number The fade start time in seconds.
duration_sec number The fade duration in seconds.

Attributes

transcode.targets.container.attributes

Example Attributes Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}.mp4",
        "container": {
          "kind": "mp4",
          "attributes": [
            {
              "name": "custom_tag1",
              "value": "custom_value1"
            },
            {
              "name": "custom_tag2",
              "value": "custom_value2"
            }
          ]
        },
        "video": {
        },
        "audio": [
        ]
      }
    ]
  }
}
Name Type Description
name string The name component of a name/value pair.
value string The value component of a name/value pair.

Mov_atoms

transcode.targets.container.mov_atoms

Example Mov_atoms Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}.mp4",
        "container": {
          "kind": "mov",
          "mov_atoms": {
            "no_empty_elst": true,
            "no_negative_cts": true
          }
        },
        "video": {
        },
        "audio": [
        ]
      }
    ]
  }
}
Name Type Description
no_empty_elst boolean Avoids writing an initial elst entry in edts.
no_negative_cts boolean Avoids negative CTS values (in MOV and MP4) to support QuickTime 7.x and lower.

Scte35_in_source

transcode.targets.container.scte35_in_source

Example Scte35_in_source Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_scte35.ts",
        "existing_files": "replace",
        "container": {
          "kind": "mpegts",
          "scte35": {
            "write_scte35_packets": true,
            "use_source_media": "if_exists"
          },
          "pmt_pid": 480,
          "transport_id": 1,
          "muxrate_kb": 2860
        },
        "video": {
          "codec": "mpeg2",
          "pid": 481,
          "width": 720,
          "height": 480,
          "max_bframes": 2,
          "frames": 90
        },
        "audio": [
          {
            "pid": 482,
            "codec": "ac3",
            "channels": 2,
            "sample_rate": 48000,
            "bitrate_kb": 224,
            "language": "eng"
          }
        ]
      }
    ]
  }
}
Name Type Description
use_source_media enum                          
required
if_exists
Use the SCTE35 data in the source.
write_scte35_packets boolean This setting may be set to false in order to insert i-frames at the locations defined by the SCTE35 metadata but not actually insert the SCTE35 packets.
default: true

Scte35_in_sidecar

transcode.targets.container.scte35_in_sidecar

Example Scte35_in_sidecar Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_scte35.ts",
        "existing_files": "replace",
        "container": {
          "kind": "mpegts",
          "scte35": {
            "write_scte35_packets": true,
            "splicepoint_file": {
              "storage_provider": "s3",
              "path": "s3://my_bucket/my_folder/my_splicepointfile.json"
            }
          },
          "pmt_pid": 480,
          "transport_id": 1,
          "muxrate_kb": 2860
        },
        "video": {
          "codec": "mpeg2",
          "pid": 481,
          "width": 720,
          "height": 480,
          "max_bframes": 2,
          "frames": 90
        },
        "audio": [
          {
            "pid": 482,
            "codec": "ac3",
            "channels": 2,
            "sample_rate": 48000,
            "bitrate_kb": 224,
            "language": "eng"
          }
        ]
      }
    ]
  }
}
Name Type Description
splicepoint_file object The location of the file with the insertion points for the SCTE35 markers.
write_scte35_packets boolean This setting may be set to false in order to insert i-frames at the locations defined by the SCTE35 metadata but not actually insert the SCTE35 packets.
default: true

Scte35_in_json

transcode.targets.container.scte35_in_json

Example Scte35_in_json Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_scte35.ts",
        "existing_files": "replace",
        "container": {
          "kind": "mpegts",
          "scte35": {
            "write_scte35_packets": true,
            "scte35_sections": [
              {
                "insertTime": 180000,
                "spliceInfoSection": {
                  "spliceInsert": {
                    "spliceEventId": 8,
                    "program": {
                      "spliceTime": {
                        "ptsTime": 360000
                      }
                    }
                  }
                }
              },
              {
                "insertTime": 450000,
                "spliceInfoSection": {
                  "spliceInsert": {
                    "spliceEventId": 9,
                    "program": {
                      "spliceTime": {
                        "ptsTime": 630000
                      }
                    }
                  }
                }
              },
              {
                "insertTime": 810000,
                "spliceInfoSection": {
                  "spliceInsert": {
                    "spliceEventId": 10,
                    "program": {
                      "spliceTime": {
                        "ptsTime": 900000
                      }
                    }
                  }
                }
              }
            ]
          },
          "pmt_pid": 480,
          "transport_id": 1,
          "muxrate_kb": 2860
        },
        "video": {
          "codec": "mpeg2",
          "pid": 481,
          "width": 720,
          "height": 480,
          "max_bframes": 2,
          "frames": 90
        },
        "audio": [
          {
            "pid": 482,
            "codec": "ac3",
            "channels": 2,
            "sample_rate": 48000,
            "bitrate_kb": 224,
            "language": "eng"
          }
        ]
      }
    ]
  }
}
Name Type Description
scte35_sections array An array describing the insertion points for the SCTE35 markers. Each element in the array describes the parameters of the SCTE35 marker.
write_scte35_packets boolean This setting may be set to false in order to insert i-frames at the locations defined by the SCTE35 metadata but not actually insert the SCTE35 packets.
default: true

Scte35_sections

transcode.targets.container.scte35_in_json.scte35_sections

Example Scte35_sections Object

{
  "file_pattern": "{source_basename}_scte35.ts",
  "existing_files": "replace",
  "container": {
    "kind": "mpegts",
    "scte35": {
      "write_scte35_packets": true,
      "scte35_sections": [
        {
          "insertTime": 180000,
          "spliceInfoSection": {
            "spliceInsert": {
              "spliceEventId": 8,
              "program": {
                "spliceTime": {
                  "ptsTime": 360000
                }
              }
            }
          }
        },
        {
          "insertTime": 450000,
          "spliceInfoSection": {
            "spliceInsert": {
              "spliceEventId": 8,
              "program": {
                "spliceTime": {
                  "ptsTime": 630000
                }
              }
            }
          }
        }
      ]
    },
    "pmt_pid": 480,
    "transport_id": 1,
    "muxrate_kb": 2860
  }
}
Name Type Description
insertTime integer The insertion time for the SCTE35 marker.
spliceInfoSection object The splice information for a specific ad.

SpliceInfoSection

transcode.targets.container.scte35_in_json.scte35_sections.spliceInfoSection

Example SpliceInfoSection Object

{
  "file_pattern": "{source_basename}_scte35.ts",
  "existing_files": "replace",
  "container": {
    "kind": "mpegts",
    "scte35": {
      "write_scte35_packets": true,
      "scte35_sections": [
        {
          "insertTime": 180000,
          "spliceInfoSection": {
            "spliceInsert": {
              "spliceEventId": 8,
              "program": {
                "spliceTime": {
                  "ptsTime": 360000
                }
              }
            }
          }
        },
        {
          "insertTime": 450000,
          "spliceInfoSection": {
            "spliceInsert": {
              "spliceEventId": 8,
              "program": {
                "spliceTime": {
                  "ptsTime": 630000
                }
              }
            }
          }
        }
      ]
    },
    "pmt_pid": 480,
    "transport_id": 1,
    "muxrate_kb": 2860
  }
}
Name Type Description
spliceInsert object The splice information for a specific insertion point.

SpliceInsert

transcode.targets.container.scte35_in_json.scte35_sections.spliceInfoSection.spliceInsert

Example SpliceInsert Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_scte35.ts",
        "existing_files": "replace",
        "container": {
          "kind": "mpegts",
          "scte35": {
            "write_scte35_packets": true,
            "scte35_sections": [
              {
                "insertTime": 180000,
                "spliceInfoSection": {
                  "spliceInsert": {
                    "spliceEventId": 8,
                    "program": {
                      "spliceTime": {
                        "ptsTime": 360000
                      }
                    }
                  }
                }
              }
            ]
          }
        },
        "video": {
          "codec": "mpeg2",
          "width": 720,
          "height": 480
        },
        "audio": [
          {
            "codec": "ac3",
            "channels": 2,
            "sample_rate": 48000,
            "bitrate_kb": 224
          }
        ]
      }
    ]
  }
}
Name Type Description
spliceEventId integer The ID for the splice event.
spliceEventCancelIndicator boolean In a broadcast, indicates whether a specific insertion has been cancelled.
outOfNetworkIndicator boolean True indicates cue-out from the network (the start of an ad). False indicates cue-in from the ad to the network.
programSpliceFlag boolean Setting this flag to true indicates Program Splice Mode, while setting it to false indicates Component Splice Mode.
uniqueProgramId integer A unique identifier for the viewing event.
availNum integer An identification for a specific avail within one Unique Program ID.
availsExpected integer The count for the expected number of individual avails within the current viewing event. If this field is set to zero, then the availNum field is ignored.
program object Object to specify the spliceTime of the Program.

Program

transcode.targets.container.scte35_in_json.scte35_sections.spliceInfoSection.spliceInsert.program

Example Program Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_scte35.ts",
        "existing_files": "replace",
        "container": {
          "kind": "mpegts",
          "scte35": {
            "write_scte35_packets": true,
            "scte35_sections": [
              {
                "insertTime": 180000,
                "spliceInfoSection": {
                  "spliceInsert": {
                    "spliceEventId": 8,
                    "program": {
                      "spliceTime": {
                        "ptsTime": 360000
                      }
                    }
                  }
                }
              }
            ]
          }
        },
        "video": {
          "codec": "mpeg2",
          "width": 720,
          "height": 480
        },
        "audio": [
          {
            "codec": "ac3",
            "channels": 2,
            "sample_rate": 48000,
            "bitrate_kb": 224
          }
        ]
      }
    ]
  }
}
Name Type Description
spliceTime object The Program spliceTime.

SpliceTime

transcode.targets.container.scte35_in_json.scte35_sections.spliceInfoSection.spliceInsert.program.spliceTime

Example SpliceTime Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_scte35.ts",
        "existing_files": "replace",
        "container": {
          "kind": "mpegts",
          "scte35": {
            "write_scte35_packets": true,
            "scte35_sections": [
              {
                "insertTime": 180000,
                "spliceInfoSection": {
                  "spliceInsert": {
                    "spliceEventId": 8,
                    "program": {
                      "spliceTime": {
                        "ptsTime": 360000
                      }
                    }
                  }
                }
              }
            ]
          }
        },
        "video": {
          "codec": "mpeg2",
          "width": 720,
          "height": 480
        },
        "audio": [
          {
            "codec": "ac3",
            "channels": 2,
            "sample_rate": 48000,
            "bitrate_kb": 224
          }
        ]
      }
    ]
  }
}
Name Type Description
ptsTime integer The spliceTime in PTS.

Video

Example Video Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "existing_files": "replace",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "width": 1920,
          "height": 1080,
          "bitrate_kb": 6000,
          "max_bitrate_kb": 8000,
          "bitrate_mode": "vbr"
        },
        "audio": [
          {
            "codec": "heaac_v2",
            "channels": 2,
            "sample_rate": 48000
          }
        ]
      }
    ]
  }
}

transcode.targets.video

Name Type Description
include_if_source_has array This array allows for conditionally outputting tracks based on whether or not a specific input track exists. The tracks in the source are referred to by number reference: audio[0] refers to the first audio track.
include_conditions array Include this target output if these conditions are met.
enabled boolean Enable or disable video in output.
default: true
verify boolean Enable or disable post transcode verification for this track.
default: true
codec enum                          
copy
h264
h265
prores
mpeg1
mpeg2
vp8
vp9
av1
dv25
dv50
dnxhd
mjpeg
jpeg2000
raw
png
jpeg
The desired output video codec.
pid integer
The video program ID - only used for MPEG transport streams.
maximum: 8190
track_group_id string This indicates which Group this track belongs to. Multiple tracks with the same content but different bitrates would have the same track_group_id.
layer_id string This indicates which Layer this track belongs to. For example, this allows bundling one video layer and multiple audio layers with the same bitrates but different languages.
layer_affinities array This indicates which other layers this layer can be combined with. For example, to combine audio and video layers.
width number
Width of the output video.
minimum: 8
maximum: 8192
height number
Height of the output video.
minimum: 8
maximum: 8192
width_modulus integer
If width is calculated automatically (from aspect ratio and height for example), then only allow truncated integer multiples of this value.
minimum: 1
maximum: 64
default: 2
height_modulus integer
If height is calculated automatically (from aspect ratio and width for example), then only allow truncated integer multiples of this value.
minimum: 1
maximum: 64
default: 2
frame_rate number
string
The video frame rate - can be expressed in decimal or fraction notation, examples: 29.97, 30000/1001
par number
string
The pixel aspect ratio. Optional. May be expressed in decimal or fraction notation, examples: 0.9, 8/9
dar number
string
The display aspect ratio. Optional. May be expressed in decimal or fraction notation, examples: 1.33, 4/3
ar_max_distortion number
A small amount of distortion can be allowed in order to minimize letter- or pillar-boxing. The default is 5%.
maximum: 100
default: 0.05
ar_auto_crop enum                          
none
distorted
preserving
If an aspect ratio adjustment needs to occur, setting 'none' here will add full letter/pillar boxes. Selecting 'distorted' will reduce the required padding and cropping, as determined by ar_pad_crop_ratio, by distorting the image with ar_max_distortion. Selecting 'preserving' will do a full aspect ratio correct operation, using the cropping vs. padding ratio from ar_pad_crop_ratio.
default: none
ar_pad_crop_ratio number
For reducing letter and pillar boxing, the video can instead be slightly or fully cropped. Setting a value of 1 here will never crop and will add full letter/pillar boxes. Setting a value of 0 will fully crop with no letter/pillar boxing. If ar_auto_crop is set to 'distorted', then ar_max_distortion is applied before the required cropping determined by ar_pad_crop_ratio is calculated. Currently, only 0.0 and 1.0 are supported. The full range will be supported in a future release. See the example following this table.
maximum: 1
default: 1.0
video_format enum                          
component
pal
ntsc
secam
mac
unspecified
Video format flag (metadata only).
add_vbi_if_needed boolean
If the video height is 480 or 576, pad the video with 32 top lines without distorting the aspect ratio.
interlace_mode enum                          
progressive
tff
bff
The interlacing mode: progressive, top field first, or bottom field first.
smart_temporal_conversions boolean If source/dest interlacing properties or frame rates differ, automatically apply the best possible conversion.
smart_chroma_conversions boolean If source/dest hd/sd properties or color space/matrix/primary differ, automatically apply the best possible conversion.
chroma_format enum                          
yuv411p
yuv420p
yuv422p
yuv420p10le
yuv422p10le
yuv444p10le
yuva444p10le
yuv420p12le
yuv422p12le
yuv444p12le
yuv420p16le
yuv422p16le
yuv444p16le
yuva444p16le
yuvj420p
yuvj422p
rgb24
rgb48be
rgba64be
rgb48le
rgba64le
The pixel format. Note that not all codecs will support all formats.
ire_range_mode enum                          
auto
full
limited
The IRE signal range of the output (full or limited). The default is determined by video size.
color_primaries enum                          
bt601
bt709
bt470m
bt470bg
smpte170m
smpte240m
smpte431
smpte432
bt2020
film
Chroma coordinate reference of the source primaries. The default is determined by video size.
color_trc enum                          
bt601
bt709
st2084
bt470m
gamma22
bt470bg
gamma28
smpte170m
smpte240m
smpte428
linear
log
log sqrt
bt1361 ecg
iec61966 2.1
iec61966 2.4
bt2020_10bit
bt2020_12bit
hlg
arib_stdb67
Color transfer characteristics. The default is determined by video size.
color_matrix enum                          
rgb
bt470bg
bt601
bt709
smpte170m
smpte240m
bt2020c
bt2020nc
smpte2085
YUV/YCbCr colorspace type.
use_broadcast_safe boolean This will limit signal values to permitted IRE levels (using 7.5..100 IRE).
default: false
force_source_par number
string
This will override the automatically detected source pixel aspect ratio. Can be omitted, or set using a numeric or fractional designation such as 0.9, 8/9, etc.
force_source_dar number
string
This will override the automatically detected source display aspect ratio. Can be omitted or set using one of the following formats: 1.33, 4/3, ...
profile enum                          
baseline
main
main10
main-intra
mainstillpicture
main444-8
main444-intra
main444-stillpicture
main10-intra
main422-10
main422-10-intra
main444-10
main444-10-intra
high
high10
high422
high444
MP
HP
SP
422P
apco
apcs
apcn
apch
ap4h
ap4x
jpeg2000
cinema2k
cinema4k
The profile for your codec. Not all profiles are valid for all codecs.
level enum                          
1.0
1.1
1.2
1.3
2.0
2.1
2.2
3.0
3.1
3.2
4.0
4.1
4.2
5.0
5.1
5.2
6.0
LL
ML
HL
H14
The codec-dependent level - please reference ISO/IEC 14496-10, ISO/IEC 13818-2 etc.
preset enum                          
ultrafast
superfast
veryfast
faster
fast
medium
slow
slower
veryslow
placebo
Codec-dependent preset, applies to h264 and h265 only.
tune enum                          
psnr
ssim
fastdecode
zerolatency
grain
film
animation
stillimage
touhou
Codec-dependent tune option, applies to av1, vp9, h264 and h265 only. Allowed values depend on codec.
use_cabac boolean This will enable context-adaptive binary arithmetic coding for h.264. If not set, the profile/level combination will determine if CABAC is used.
refs integer
The number of h.264 reference frames to use for future frames. If not set, the profile/level combination will determine the proper number of reference frames.
maximum: 16
slices integer
The number of h.264 frame slices.
use_loop_filter boolean Enable h264/h265 loop filters.
x264_options string
x.264 specific codec options - please reference https://sites.google.com/site/linuxencoding/x264-ffmpeg-mapping for an excellent explanation.
x265_options string
x.265 specific codec options.
mainconcept_video_options string
MainConcept specific codec options - please reference the mainconcept codec documentation.
mainconcept_stream_mux_options string
Provide direct stream instruction to the MainConcept multiplexer. Values are constructed as "prop=val,prop=val". See MainConcept documentation for valid values.
ffmpeg_args string
The FFmpeg (target) command line arguments to be used. Note that these will override competing settings in the JSON.
encoder_info boolean x264/x265: prevents writing the encoder info string (requires the hybrik_ffmpeg version).
bitrate_mode enum                          
cbr
cbr_unconstrained
crf
vbr
cq
The bitrate mode for the codec. The default value depends on the codec being used. "crf" bitrate mode (Constant Rate Factor) is only valid for x.264
bitrate_kb number
The video bitrate in kilobits per second. For vbr, this is the average bitrate.
minimum: 1
maximum: 1000001
min_bitrate_kb number
The minimum video bitrate in kilobits per second. Only valid for crf and vbr.
minimum: 1
maximum: 1000001
max_bitrate_kb number
The maximum video bitrate in kilobits per second. Only valid for crf and vbr.
minimum: 1
maximum: 1000001
vbv_buffer_size_kb number
The vbv buffer size in kilobits.
maximum: 1000000
vbv_init_occupancy_kb number
The vbv init occupancy in kilobits. Important for chunked encoding like HLS.
maximum: 1000000
max_available_vbv number
The maximum vbv fullness (0 = 0%, 1 = 100%).
maximum: 1
min_available_vbv number
The minimum vbv fullness (0 = 0%, 1 = 100%).
maximum: 1
vbv_constraints_failure enum                          
during_pass_1
before_pass_2
after_pass_2
Specify when during the transcode to fail if VBV constraints can not be met.
use_closed_gop boolean Use closed GOPs - not valid for all codecs.
first_gop_closed boolean First GOP only shall be closed - only valid for Mainconcept MPEG2.
max_bframes integer
The maximum number of B frames between I and P frames.
maximum: 100
forced_keyframes object Allows forcing keyframe insertion at specific frames or times.
crf number
The Constant Rate Factor setting for h.264 and h.265. A setting of 18 is considered excellent. A change of plus/minus 6 should halve/double the resulting file size. See https://trac.ffmpeg.org/wiki/Encode/H.264
maximum: 63
q number
A setting to determine quality; the effect depends on the codec used.
qscale number
A setting to determine QScale difference between I and P frames. Codec dependent, see https://www.ffmpeg.org/ffmpeg-codecs.html
qmin number
The video minimum quantizer scale.
minimum: -1
maximum: 69
qmax number
The video maximum quantizer scale.
minimum: -1
maximum: 1024
dc_precision integer
The number of bits to use in calculating the DC component of intra-coded blocks.
minimum: 8
maximum: 10
use_sequence_display_extension boolean This will write the sequence display extension (MPEG2 only).
use_sequence_header_per_gop boolean This will write a sequence header for each gop.
use_intra boolean Set to use only I-frames. Default depends on the codec.
use_intra_vlc boolean Set to use intra-vlc tables only (MPEG2 only).
use_non_linear_quant boolean Set to use non-linear quantizer (MPEG2 only).
use_interlace_encode_mode boolean This determines if the codec shall be forced to perform interlaced (field-separated) encodes. Default: "auto". Not to be confused with interlace_mode.
use_scene_detection boolean For constant GOP length, scene detection needs to be disabled. This will come with a steep penalty on video quality. Keep enabled (the default) if possible.
use_low_delay boolean Instruct the encoder to use low delay encoding modes, exact meaning varies by codec.
rtp_payload_size integer RTP payload size in bytes.
afd number
Set the AFD (Active Format Description) value.
vtag string
Allows overriding the default hev1 or hvc1 tags applied to HEVC content.
track_name string
The name of this video track - will be used for mov files and MPEG-DASH (representation::id) for example. May be ignored, depending on your container format.
closed_captions object Object describing the CC parameters for the targeted output.
mpeg2 object A set of MPEG2-specific options for closed captions and telecine.
mov_atoms object Override video track MOV atoms.
hdr10 object Object describing the HDR10 metadata source location and mastering display characteristics.
image_sequence object Object defining the settings to be used when outputting a sequence of images.
filters array An array of video filters to be applied to the output targets.
scaler object The type of function to be used in scaling operations.
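
Many of the rate control and aspect ratio options above do not appear in the example. The following is a hedged sketch of a single target using CRF rate control with a VBV cap and automatic aspect ratio handling (see also ar_pad_crop_ratio above); the numeric values are placeholders.

{
  "file_pattern": "{source_basename}_crf.mp4",
  "existing_files": "replace",
  "container": {
    "kind": "mp4"
  },
  "video": {
    "codec": "h264",
    "profile": "high",
    "level": "4.0",
    "preset": "slow",
    "bitrate_mode": "crf",
    "crf": 21,
    "max_bitrate_kb": 8000,
    "vbv_buffer_size_kb": 12000,
    "width": 1280,
    "height": 720,
    "dar": "16/9",
    "ar_auto_crop": "preserving",
    "ar_pad_crop_ratio": 1.0
  },
  "audio": [
    {
      "codec": "aac",
      "channels": 2,
      "sample_rate": 48000,
      "bitrate_kb": 128
    }
  ]
}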

Forced_keyframes

transcode.targets.video.forced_keyframes

Example Forced_keyframes Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "existing_files": "replace",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "width": 1920,
          "height": 1080,
          "bitrate_kb": 6000,
          "max_bitrate_kb": 8000,
          "bitrate_mode": "vbr",
          "forced_keyframes": {
            "frames": [
              1000,
              10000,
              30000
            ]
          }
        },
        "audio": [
          {
            "codec": "heaac_v2",
            "channels": 2,
            "sample_rate": 44100
          }
        ]
      }
    ]
  }
}
Name Type Description
kind enum                          
i
idr
The type of keyframes to be created - i or idr.
frames array An array of frame numbers to use for the keyframe insertion.
times_sec array An array of locations in seconds to use for the keyframe insertion.
timecodes array An array of timecode values to use for the keyframe insertion.
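
The example above forces keyframes by frame number. A hedged variant of the same object requests IDR keyframes at positions given in seconds instead; the times are arbitrary.

"forced_keyframes": {
  "kind": "idr",
  "times_sec": [
    2.0,
    10.0,
    30.0
  ]
}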

Closed_captions

transcode.targets.video.closed_captions

Example Closed_captions Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "existing_files": "replace",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "width": 1920,
          "height": 1080,
          "bitrate_kb": 6000,
          "max_bitrate_kb": 8000,
          "bitrate_mode": "vbr",
          "closed_captions": {
            "enable_cea608": true,
            "enable_cea708": true
          }
        },
        "audio": [
          {
            "codec": "heaac_v2",
            "channels": 2,
            "sample_rate": 44100
          }
        ]
      }
    ]
  }
}
Name Type Description
suppress boolean Suppress including captions in the output.
packets_per_chunk integer How many CC packets to insert into frames without CC.
enable_scte20 enum                          
true
false
auto
Enable SCTE 20 compatible captions in the output. Set to true, false (default), or "auto".
enable_scte128 enum                          
true
false
auto
Enable SCTE 128 compatible captions. Set to true, false, or "auto" (default).
enable_a53 enum                          
true
false
auto
Enable NTSC A53 compatible captions. Set to true, false, or "auto" (default).
default: auto
enable_quicktime_c608 enum                          
true
false
auto
Enable NTSC 608 compatible captions in the QuickTime output. Set to true, false (default), or "auto".
enable_quicktime_c708 enum                          
true
false
auto
Enable NTSC 708 compatible captions in the QuickTime output. Set to true, false (default), or "auto".
enable_smpte436m enum                          
true
false
auto
Enable SMPTE 436M compatible captions in the MXF output. Set to true, false (default), or "auto".
enable_cea608 enum                          
true
false
auto
Enable NTSC 608 compatible captions in the output. Set to true, false, or "auto" (default).
default: auto
enable_cea708 enum                          
true
false
auto
Enable NTSC 708 compatible captions in the output. Set to true, false, or "auto" (default).
default: auto
cc1_608_to_708_service_mapping integer This allows control over which 708 service will receive the 608 CC1 caption data. The default behavior for CC1 is to be mapped to 708 Service 1.
cc2_608_to_708_service_mapping integer This allows control over which 708 service will receive the 608 CC2 caption data.
cc3_608_to_708_service_mapping integer This allows control over which 708 service will receive the 608 CC3 caption data. The default behavior for CC3 is to be mapped to 708 Service 2.
cc4_608_to_708_service_mapping integer This allows control over which 708 service will receive the 608 CC4 caption data.
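
The service mapping settings do not appear in the example above. The following fragment (video.closed_captions only) keeps 608 and 708 captions enabled and routes 608 CC1 and CC3 to 708 services 1 and 2, matching the documented default behavior; the mapping values are illustrative.

"closed_captions": {
  "enable_cea608": true,
  "enable_cea708": true,
  "cc1_608_to_708_service_mapping": 1,
  "cc3_608_to_708_service_mapping": 2
}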

Mpeg2

transcode.targets.video.mpeg2

Example Mpeg2 Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.ts",
        "existing_files": "replace",
        "container": {
          "kind": "mpeg2ts"
        },
        "video": {
          "codec": "mpeg2",
          "mpeg2": {
            "cea_608_closed_captions": true,
            "cea_708_closed_captions": true
          },
          "width": 1920,
          "height": 1080,
          "bitrate_kb": 6000,
          "max_bitrate_kb": 8000,
          "bitrate_mode": "vbr"
        },
        "audio": [
          {
            "codec": "heaac_v2",
            "channels": 2,
            "sample_rate": 48000
          }
        ]
      }
    ]
  }
}
Name Type Description
soft_telecine boolean This will produce a file with field repeat flags set, playing interlaced @ 29.97. Requires a frame rate of 24000/1001 and the hybrik_3.2 or greater ffmpeg version.

Mov_atoms

transcode.targets.video.mov_atoms

Example Mov_atoms Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mov",
        "existing_files": "replace",
        "container": {
          "kind": "mov"
        },
        "video": {
          "codec": "h264",
          "width": 1920,
          "height": 1080,
          "bitrate_kb": 6000,
          "max_bitrate_kb": 8000,
          "bitrate_mode": "vbr",
          "mov_atoms": {
            "tapt": {
              "clef": "1920:1080",
              "prof": "1920:1080",
              "enof": "1920:1080"
            }
          }
        },
        "audio": [
          {
            "codec": "heaac_v2",
            "channels": 2,
            "sample_rate": 48000
          }
        ]
      }
    ]
  }
}
Name Type Description
pasp string Override the PASP atom in the form x:y.
gama number
string
Override the GAMA atom as a float value
fiel string Override the FIEL atom with a pre-defined quicktime API integer value in hex notation (https://developer.apple.com/library/content/documentation/QuickTime/QTFF/QTFFChap3/qtff3.html#//apple_ref/doc/uid/TP40000939-CH205-124374)
tapt object Override the TAPT atom with individual settings.
clap string Override the CLAP atom in the form x:x:x:x:x:x:x:x (8 entries).
use_clap boolean Set this to false to prevent writing the CLAP atom.
default: true
media_uuid string Provide UUID in the form '324D8401-7083-4F5F-A0B1-D768CED82E43'
encoder string Provide STSD encoder string.

Tapt

transcode.targets.video.mov_atoms.tapt

Example Tapt Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mov",
        "existing_files": "replace",
        "container": {
          "kind": "mov"
        },
        "video": {
          "codec": "h264",
          "width": 1920,
          "height": 1080,
          "bitrate_kb": 6000,
          "max_bitrate_kb": 8000,
          "bitrate_mode": "vbr",
          "mov_atoms": {
            "tapt": {
              "clef": "1920:1080",
              "prof": "1920:1080",
              "enof": "1920:1080"
            }
          }
        },
        "audio": [
          {
            "codec": "heaac_v2",
            "channels": 2,
            "sample_rate": 48000
          }
        ]
      }
    ]
  }
}
Name Type Description
clef string Override clef in the form width:height
prof string Override prof in the form width:height
enof string Override enof in the form width:height

Hdr10

transcode.targets.video.hdr10

Example Hdr10 Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_hdr10.mp4",
        "existing_files": "replace",
        "nr_of_passes": 2,
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h265",
          "width": 1920,
          "height": 1080,
          "bitrate_kb": 12000,
          "max_bitrate_kb": 13200,
          "vbv_buffer_size_kb": 13200,
          "bitrate_mode": "vbr",
          "chroma_format": "yuv420p10le",
          "profile": "main10",
          "level": "5.0",
          "color_primaries": "bt2020",
          "color_trc": "st2084",
          "color_matrix": "bt2020nc",
          "hdr10": {
            "source": "config",
            "master_display": "G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(40000000,47)",
            "max_cll": 4000,
            "max_fall": 0
          }
        }
      }
    ]
  }
}
Name Type Description
source enum                          
config
source_metadata
source_document
media
metadata_file
none
The source for the HDR10 metadata.
master_display string
The mastering display metadata string (color primaries, white point, and luminance range), as in the example above.
max_cll number
The maximum Content Light Level (CLL) for the file.
max_fall number
The maximum Frame Average Light Level (FALL) for the file.

Image_sequence

transcode.targets.video.image_sequence

Example Image_sequence Object

{
  "uid": "transcode_thumbnails",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder/thumbnails"
    },
    "targets": [
      {
        "uid": "thumbnails",
        "file_pattern": "thumb_%05d.jpg",
        "existing_files": "replace",
        "video": {
          "codec": "png",
          "width": 256,
          "height": 144,
          "image_sequence": {
            "total_number": 100,
            "offset_sec": 0
          }
        }
      }
    ]
  }
}
Name Type Description
total_number integer The total number of images to create when outputting a sequence of images.
minimum: 1
start_number integer The starting number for files to use when outputting a sequence of images.
offset_sec number The number of seconds to offset the output.
relative_offset number The relative offset (percentage) to offset the output.

Video Filters

Example Video Filters Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_output.mp4",
        "existing_files": "replace",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "bitrate_mode": "vbr",
          "bitrate_kb": 1800,
          "max_bitrate_kb": 2000,
          "height": 720,
          "filters": [
            {
              "kind": "crop",
              "payload": {
                "top": 20,
                "bottom": 20
              }
            },
            {
              "kind": "print_timecode",
              "payload": {
                "x": "(w-lw)/2",
                "y": "h/4",
                "font_size": 20,
                "source_timecode_selector": "gop"
              }
            }
          ]
        },
        "audio": [
          {
            "codec": "aac_lc",
            "channels": 2,
            "bitrate_kb": 96
          }
        ]
      }
    ]
  }
}

transcode.targets.video.filters

Name Type Description
kind enum                          
print_timecode
print_subtitle
crop
gaussian_blur
image_overlay
video_overlay
telecine
deinterlace
fade
color_convert
ffmpeg
Specifies the type of filter being applied.
default: ffmpeg
options object The filter options.
include_conditions array Specifies conditions under which this filter will be applied. Can use JavaScript math.js notation.
overrides object Object defining parameters to be overridden in the filter.
payload anyOf         
print_timecode
print_subtitle
crop
ffmpeg
gaussian_blur
image_overlay
video_overlay
telecine
deinterlace
fade
color_convert
The payload of the video filter.

Options

transcode.targets.video.filters.options

Example Options Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "filters": [
            {
              "kind": "color_convert",
              "options": {
                "position": "pre_convert"
              },
              "payload": {
              }
            }
          ]
        },
        "audio": [
          {
          }
        ]
      }
    ]
  }
}
Name Type Description
position enum                          
pre_analyze
pre_normalize
post_normalize
pre_convert
post_convert
default
Specifies where in the transcode process source pipeline filters will be applied.
default: default

transcode.targets.video.filters.print_timecode

Example Print_timecode Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "filters": [
            {
              "kind": "print_timecode",
              "payload": {
                "x": "(w-lw)/2",
                "y": "h/4",
                "font_size": 20,
                "source_timecode_selector": "gop"
              }
            }
          ]
        },
        "audio": [
          {
          }
        ]
      }
    ]
  }
}
Name Type Description
x integer
string
The x location to start the imprint. Can use expressions such as w-20 (w: width of the video).
default: 25
y integer
string
The y location to start the imprint. Can use expressions such as h-20 (h: height of the video).
default: 25
font string The font descriptor (compliant with fontconfig). Examples: 'Sans', 'URW Bookman L:style=Demi Bold Italic'.
font_size integer
The font size in points. A font size of 16 is the default.
default: 16
font_color string
See https://www.ffmpeg.org/ffmpeg-utils.html#Color for valid definitions. Example: blue: opaque blue, green@0.8: green with 0.8 alpha.
default: blue
background_color string
See https://www.ffmpeg.org/ffmpeg-utils.html#Color for valid definitions. Example: blue: opaque blue, green@0.8: green with 0.8 alpha.
border_size integer
Size of a border being drawn in background color.
default: 0
timecode_kind enum                          
timecode_auto
timecode_drop
timecode_nodrop
frame_nr
media_time
Choose the time/timecode format. If timecode_auto is used, drop/non-drop is chosen based on the frame rate.
default: timecode_auto
timecode_source enum                          
auto
start_value
media
Select the timecode source for imprinting.
default: media
source_timecode_selector enum                          
first
highest
lowest
mxf
gop
sdti
smpte
material_package
source_package
Specifies the metadata track to be used for timecode data.
default: first
timecode_start_value string
Start time. The units depend on the kind. Only valid for timecode_source=start_value.
lut_file object Allows referencing hosted LUT files, for example for SDR imprint into HDR video.
lut_preset enum                          
r709_to_r2020_pq_300nit
The LUT preset selection.

transcode.targets.video.filters.print_subtitle

Example Print_subtitle Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "filters": [
            {
              "kind": "print_subtitle",
              "payload": {
                "x_offset": "10%",
                "y_offset": "20%",
                "standard": "ttml",
                "language": "french"
              }
            }
          ]
        },
        "audio": [
          {
          }
        ]
      }
    ]
  }
}
Name Type Description
x_offset integer
string
The x offset. Can use expressions such as N%. If it is a number without further units, it will be considered as pixels.
y_offset integer
string
The y offset. Can use expressions such as N%. If it is a number without further units, it will be considered as pixels.
time_offset_sec number
Specify a time offset (in seconds) in either direction.
category enum                          
default
forced
sdh
Optional: specify which source subtitle shall be rendered if multiple exist in the source asset.
language string
Optional: specify which source subtitle shall be rendered if multiple exist in the source asset.
font_size string
Optional: specify the font size to use when rendering subtitles. Usage of this setting means any font size settings that may already exist in the source subtitle are ignored.
background_color string
Optional: specify the background color to use when rendering subtitles. Usage of this setting means any background color settings that may already exist in the source subtitle are ignored.
imprint_style enum                          
auto
closed_caption
subtitle
ttml
The type of subtitle imprint to use.
font_files array Allows referencing hosted font files, with an optional language specifier.
lut_file object Allows referencing hosted LUT files, for example for SDR imprint into HDR video.
lut_preset enum                          
r709_to_r2020_pq_300nit
The LUT preset selection.
is_optional boolean If set to true, the transcode will not fail if this media type did not exist in the source or no source subtitle could be located.

Crop

transcode.targets.video.filters.crop

Example Crop Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "filters": [
            {
              "kind": "crop",
              "payload": {
                "top": 20,
                "bottom": 20,
                "use_source_dimensions": true
              }
            }
          ]
        },
        "audio": [
        ]
      }
    ]
  }
}
Name Type Description
left number
string
Number of pixels to crop from left. If '%' is appended, percentage of video width. "auto" uses the value reported from an up-stream black_borders analyzer task.
right number
string
Number of pixels to crop from right. If '%' is appended, percentage of video width. "auto" uses the value reported from an up-stream black_borders analyzer task.
top number
string
Number of pixels to crop from top. If '%' is appended, percentage of video height. "auto" uses the value reported from an up-stream black_borders analyzer task.
bottom number
string
Number of pixels to crop from bottom. If '%' is appended, percentage of video height. "auto" uses the value reported from an up-stream black_borders analyzer task.
use_source_dimensions boolean
The default behavior for cropping is to use the output target dimensions. The source dimensions may be used by setting this flag.
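The pixel, percentage, and "auto" forms can be mixed within one filter. Below is a minimal sketch of a crop filter entry (it slots into the "filters" array exactly as in the example above), assuming an up-stream black_borders analyzer has run so that the "auto" values can be resolved; the specific values are for illustration only.

{
  "kind": "crop",
  "payload": {
    "left": "auto",
    "right": "auto",
    "top": "5%",
    "bottom": "5%",
    "use_source_dimensions": true
  }
}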

Ffmpeg

transcode.targets.video.filters.ffmpeg

Example Ffmpeg Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "filters": [
            {
              "kind": "ffmpeg",
              "payload": {
                "ffmpeg_filter": "yadif=mode=0:deint=0"
              }
            }
          ]
        },
        "audio": [
          {
          }
        ]
      }
    ]
  }
}
Name Type Description
support_files array Allows referencing support files (e.g. hosted font files).
ffmpeg_filter string
This allows for a custom FFmpeg video filter string. See https://ffmpeg.org/ffmpeg-filters.html

Gaussian_blur

transcode.targets.video.filters.gaussian_blur

Example Gaussian_blur Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "filters": [
            {
              "kind": "gaussian_blur",
              "payload": {
                "radius": "10"
              }
            }
          ]
        },
        "audio": [
          {
          }
        ]
      }
    ]
  }
}
Name Type Description
radius number
Gaussian Blur Radius (in pixels).

Image_overlay

transcode.targets.video.filters.image_overlay

Example Image_overlay Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "filters": [
            {
              "kind": "image_overlay",
              "payload": {
                "image_file": {
                  "storage_provider": "s3",
                  "url": "s3://my_bucket/my_input_folder/my_file.png"
                },
                "x": "overlay_w - 20",
                "y": 0,
                "height": "source_h",
                "opacity": 0.75,
                "start_sec": 5,
                "fadein_duration_sec": 10,
                "duration_sec": 30
              }
            }
          ]
        },
        "audio": [
          {
          }
        ]
      }
    ]
  }
}
Name Type Description
image_file object Defines the location of the image file to be used.
x integer
string
X position of the overlay. Can use expressions such as overlay_w-20 (overlay_w: width of the overlay).
default: 25
y integer
string
Y position of the overlay. Can use expressions such as overlay_h-20 (overlay_h: height of the overlay).
default: 25
width number
string
Width of the overlay. Can use expressions such as source_w (width of the video source).
height number
string
Height of the overlay. Can use expressions such as source_h (height of the video source).
opacity number
Opacity of the overlay image. 0 = fully transparent, 1 = fully opaque.
default: 1
start_sec number
Start point (in seconds) of the overlay.
fadein_duration_sec number
Fade-in time (in seconds) of the overlay.
duration_sec number
Duration (in seconds) of the overlay.
fadeout_duration_sec number
Fade-out time (in seconds) of the overlay.

Video_overlay

transcode.targets.video.filters.video_overlay

Example Video_overlay Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "filters": [
            {
              "kind": "video_overlay",
              "payload": {
                "video_file": {
                  "storage_provider": "s3",
                  "url": "s3://my_bucket/my_input_folder/my_file.mp4"
                },
                "x": "overlay_w - 20",
                "y": 0,
                "height": "source_h / 2",
                "opacity": 0.75,
                "start_sec": 5,
                "fadein_duration_sec": 10,
                "duration_sec": 30
              }
            }
          ]
        },
        "audio": [
          {
          }
        ]
      }
    ]
  }
}
Name Type Description
video_file object The location of the video file to be overlaid on the output target.
x integer
string
X position of the overlay. Can use expressions such as overlay_w-20 (overlay_w: width of the overlay).
default: 25
y integer
string
Y position of the overlay. Can use expressions such as overlay_h-20 (overlay_h: height of the overlay).
default: 25
width number
string
Width of the overlay. Can use expressions such as source_w (width of the video source).
height number
string
Height of the overlay. Can use expressions such as source_h (height of the video source).
opacity number
Opacity of the overlay video. 0 = fully transparent, 1 = fully opaque.
default: 1
start_sec number
Start point (in seconds) of the overlay.
fadein_duration_sec number
Fade-in time (in seconds) of the overlay.
duration_sec number
Duration (in seconds) of the overlay.
fadeout_duration_sec number
Fade-out time (in seconds) of the overlay.
repeat_count integer
Repeat count, 0: infinite.

Telecine

transcode.targets.video.filters.telecine

Example Telecine Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "filters": [
            {
              "kind": "telecine",
              "payload": {
                "interlace_mode": "tff",
                "pattern": "23"
              }
            }
          ]
        },
        "audio": [
          {
          }
        ]
      }
    ]
  }
}
Name Type Description
interlace_mode enum                          
tff
bff
The interlacing mode. tff: top field first. bff: bottom field first.
default: tff
pattern enum                          
22
23
2332
222222222223
Specify the desired cadence pattern; 2332 is the default. 22 is a special case, causing interlacing without a frame rate change.
default: 2332

Deinterlace

transcode.targets.video.filters.deinterlace

Example Deinterlace Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "kind": "deinterlace",
          "payload": {
            "interlace_mode": "auto",
            "motion_compensation": true,
            "motion_compensation_quality": "high"
          }
        },
        "audio": [
        ]
      }
    ]
  }
}
Name Type Description
interlace_mode enum                          
tff
bff
auto
frame_metadata
The source interlacing mode - auto means auto-detect.
default: auto
motion_compensation boolean Use a motion-compensated deinterlacer. Quality is better but CPU use will be significantly higher.
motion_compensation_quality enum                          
low
medium
high
veryhigh
Quality settings for the motion-compensated deinterlacer.

Fade

transcode.targets.video.filters.fade

Example Fade Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "filters": [
            {
              "kind": "fade",
              "payload": {
                "mode": "in",
                "start_sec": 0,
                "duration_sec": 3
              }
            }
          ]
        },
        "audio": [
          {
          }
        ]
      }
    ]
  }
}
Name Type Description
mode enum                          
in
out
The fade mode, in or out.
default: in
start_sec number Fade start time in seconds.
duration_sec number Fade duration in seconds.

Color_convert

transcode.targets.video.filters.color_convert

Example Color_convert Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "filters": [
            {
              "kind": "color_convert",
              "payload": {
                "from": {
                  "ire_range_mode": "full",
                  "color_primaries": "bt2020",
                  "color_trc": "hlg",
                  "color_matrix": "bt2020c"
                },
                "to": {
                  "ire_range_mode": "limited",
                  "color_primaries": "bt709",
                  "color_trc": "gamma28",
                  "color_matrix": "bt709"
                },
                "nominal_peak_luminance": 1000
              }
            }
          ]
        },
        "audio": [
          {
          }
        ]
      }
    ]
  }
}
Name Type Description
from object The color description for the source.
to object The color description for the target.
nominal_peak_luminance number The nominal peak luminance to be used during color format conversion.
preset enum                          
hdr_hlg_to_sdr
hdr_hlg_to_sdr_desat_mild
hdr_hlg_to_sdr_desat_medium
hdr_pq_to_sdr
hdr_pq_to_sdr_desat_mild
hdr_pq_to_sdr_desat_medium
Common presets used for converting between color spaces.
lut_file object The LUT file to be used during the color conversion.
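For the common conversions, the preset option can be used instead of specifying from and to explicitly. A minimal sketch of such a filter entry (it is placed inside the "filters" array exactly like the example above):

{
  "kind": "color_convert",
  "payload": {
    "preset": "hdr_pq_to_sdr"
  }
}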

From

transcode.targets.video.filters.color_convert.from

Example From Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "filters": [
            {
              "kind": "color_convert",
              "payload": {
                "from": {
                  "ire_range_mode": "full",
                  "color_primaries": "bt2020",
                  "color_trc": "hlg",
                  "color_matrix": "bt2020c"
                },
                "to": {
                  "ire_range_mode": "limited",
                  "color_primaries": "bt709",
                  "color_trc": "gamma28",
                  "color_matrix": "bt709",
                },
                "nominal_peak_luminance": 1000
              }
            }
          ]
        },
        "audio": [
          {
          }
        ]
      }
    ]
  }
}
Name Type Description
ire_range_mode enum                          
auto
full
limited
The signal (IRE) range mode: full range or limited (broadcast/video) range.
color_primaries enum                          
bt601
bt709
bt470m
bt470bg
smpte170m
smpte240m
smpte431
smpte432
bt2020
film
Chroma coordinate reference of the primaries. The default is determined by video size.
color_trc enum                          
bt601
bt709
st2084
gamma22
gamma28
smpte170m
smpte240m
linear
log
log sqrt
bt1361 ecg
iec61966 2.1
iec61966 2.4
bt2020_10bit
bt2020_12bit
hlg
arib_stdb67
Color transfer characteristics. The default is determined by video size.
color_matrix enum                          
rgb
bt470bg
bt601
bt709
smpte170m
smpte240m
bt2020c
bt2020nc
smpte2085
YUV/YCbCr colorspace type.

To

transcode.targets.video.filters.color_convert.to

Example To Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "filters": [
            {
              "kind": "color_convert",
              "payload": {
                "from": {
                  "ire_range_mode": "full",
                  "color_primaries": "bt2020",
                  "color_trc": "hlg",
                  "color_matrix": "bt2020c"
                },
                "to": {
                  "ire_range_mode": "limited",
                  "color_primaries": "bt709",
                  "color_trc": "gamma28",
                  "color_matrix": "bt709",
                },
                "nominal_peak_luminance": 1000
              }
            }
          ]
        },
        "audio": [
          {
          }
        ]
      }
    ]
  }
}
Name Type Description
ire_range_mode enum                          
auto
full
limited
The signal (IRE) range mode: full range or limited (broadcast/video) range.
color_primaries enum                          
bt601
bt709
bt470m
bt470bg
smpte170m
smpte240m
smpte431
smpte432
bt2020
film
Chroma coordinate reference of the primaries. The default is determined by video size.
color_trc enum                          
bt601
bt709
st2084
gamma22
gamma28
smpte170m
smpte240m
linear
log
log sqrt
bt1361 ecg
iec61966 2.1
iec61966 2.4
bt2020_10bit
bt2020_12bit
hlg
arib_stdb67
Color transfer characteristics. The default is determined by video size.
color_matrix enum                          
rgb
bt470bg
bt601
bt709
smpte170m
smpte240m
bt2020c
bt2020nc
smpte2085
YUV/YCbCr colorspace type.

Scaler

transcode.targets.video.scaler

Example Scaler Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264",
          "width": 1920,
          "height": 1080,
          "bitrate_kb": 6000,
          "max_bitrate_kb": 8000,
          "bitrate_mode": "vbr",
          "scaler": {
            "kind": "zscale",
            "config_string": "dither=error_diffusion",
            "apply_always": true
          }
        },
        "audio": [
          {
          }
        ]
      }
    ]
  }
}
Name Type Description
kind enum                          
default
zscale
The type of scaling to be applied.
default: default
config_string string The configuration string to be used with the specified scaling function.
apply_always boolean Always use the specified scaling function.

Audio

Example Audio Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_output{default_extension}",
        "existing_files": "replace",
        "container": {
          "kind": "mpegts"
        },
        "video": {
          "width": 1280,
          "height": 720,
          "codec": "h264"
        },
        "audio": [
          {
            "codec": "ac3",
            "pid": 482,
            "channels": 6,
            "sample_rate": 48000,
            "bitrate_kb": 384
          },
          {
            "codec": "aac_lc",
            "pid": 483,
            "channels": 2,
            "sample_rate": 48000,
            "bitrate_kb": 128
          }
        ]
      }
    ]
  }
}

Audio in Hybrik is represented by an array structure. The arrays are zero-based, meaning that the first audio track is represented by audio[0]. So, referencing the 3rd channel of the second audio track would be: audio[1], channel[2].

transcode.targets.audio

Name Type Description
include_if_source_has array This array allows for conditionally outputting tracks based on whether or not a specific input track exists. The tracks in the source are referred to by number reference: audio[0] refers to the first audio track.
include_conditions array This array allows for conditionally including output audio tracks based on conditions in the input file.
verify boolean Enable or disable post transcode verification for this track.
default: true
codec enum                          
copy
aac
mpeg2_aac
mpeg4_aac
aac_lc
heaac_v1
heaac_v2
heaac_auto
s302m
mp2
pcm
mp3
ac3
aiff
alac
flac
eac3
vorbis
opus
dolby_digital
dolby_digital_plus
The audio codec to use. Selecting 'copy' will attempt to use the compressed source audio stream.
codec_provider enum                          
default
ffmpeg
The codec provider to be used for encoding.
pid integer
The audio program ID. This is only used for MPEG transport streams.
maximum: 8190
channels integer
The number of audio channels.
minimum: 1
maximum: 16
dolby_digital_plus object The parameters for Dolby Digital Plus encoding.
sample_size enum                          
16
24
32
The audio sample size in bits.
default: 24
sample_format enum                          
pcm_f16le
pcm_f24le
pcm_f32le
pcm_f16be
pcm_f24be
pcm_f32be
pcm_s16le
pcm_s24le
pcm_s32le
pcm_s16be
pcm_s24be
pcm_s32be
pcm_u16le
pcm_u24le
pcm_u32le
pcm_u16be
pcm_u24be
pcm_u32be
The audio sample format/description.
sample_rate integer
The audio sample rate in Hz. Typical values are 44100 and 48000. Omit to use the source sample rate.
bitrate_mode enum                          
cbr
vbr
Select between constant and variable bitrate encoding. Note that not all codecs support all bitrate modes. Omit this value to use the codec's default.
bitrate_kb number
The audio bitrate in kilobits per second. This is the average bitrate in the case of vbr. Not all audio codecs support this setting. Omit to use codec's default.
minimum: 1
maximum: 1024
min_bitrate_kb number
The minimum audio bitrate in kilobits per second. Valid for vbr only.
minimum: 1
maximum: 1024
max_bitrate_kb number
The maximum audio bitrate in kilobits per second. Valid for vbr only.
minimum: 1
maximum: 1024
language string
The audio language code. ISO-639 notation is preferred, but Hybrik will attempt to convert the passed language identifier.
disposition enum                          
default
dub
original
comment
lyrics
karaoke
The audio disposition.
track_name string
The name of this audio track - will be used for mov files and MPEG-DASH (representation::id) for example. May be ignored, depending on your container format.
track_group_id string This indicates which Group this track belongs to. Multiple tracks with the same content but different bitrates would have the same track_group_id.
layer_id string This indicates which Layer this track belongs to. For example, this allows bundling one video layer and multiple audio layers with the same bitrates but different languages.
layer_affinities array This indicates which other layers this layer can be combined with. For example, to combine audio and video layers.
filters array An array of audio filters that will be applied in order to the output audio.
convert_aac_headers enum                          
disabled
adts_to_asc
For solving aac transmux issues between mp4 and ts/raw tracks.
dialnorm number
string
Dialogue Level (aka dialogue normalization or dialnorm) is the average dialogue level of a program over time, measured with an LAEq meter, referenced to 0 dBFS.
mainconcept_stream_mux_options string
Provide direct stream instruction to the MainConcept multiplexer. Values are constructed as "prop=val,prop=val". See MainConcept documentation for valid values.
pcm_wrapping enum                          
raw
bwf
aes
The type of wrapping to use for PCM audio tracks.
ffmpeg_args string
The FFmpeg (target) command line arguments to be used. Note that these will override competing settings in the JSON.
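The language, disposition, and naming fields are set per entry of the audio array. Below is a minimal sketch of a single PCM track entry carrying this metadata; the specific values are chosen for illustration only.

{
  "codec": "pcm",
  "channels": 2,
  "sample_rate": 48000,
  "sample_size": 24,
  "language": "eng",
  "disposition": "original",
  "track_name": "English Stereo"
}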

Dolby Digital Plus

transcode.targets.audio.dolby_digital_plus

Example Dolby_digital_plus Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_output{default_extension}",
        "existing_files": "replace",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "width": 1280,
          "height": 720,
          "codec": "h264"
        },
        "audio": [
          {
            "codec": "dolby_digital_plus",
            "channels": 2,
            "sample_rate": 48000,
            "dolby_digital_plus": {
              "bitstream_mode": "complete_main",
              "surround_attenuation_3_db": true
            }
          }
        ]
      }
    ]
  }
}
Name Type Description
constraints enum                          
broadcast_atsc
broadcast_ebu
streaming
bluray
The target output constraint to use for Dolby Digital encoding.
audio_coding_mode enum                          
3/2
3/2L
3/1
3/1L
3/0
3/0L
2/2
2/1
2/0
1/0
Defines the number of full bandwidth audio channels being encoded.
bitstream_mode enum                          
complete_main
music_and_effects
visually_impaired
hearing_impaired
dialogue
commentary
emergency
voice
Type of audio bitstream being processed. 'complete_main' and 'music_and_effects' are main audio services; the rest are associated audio services.
dialnorm number Dialogue Level (aka dialogue normalization or dialnorm) is the average dialogue level of a program over time, measured with an LAEq meter, referenced to 0 dBFS. This is the equivalent of loudness_target in DPLC, with a range of -31 dB to -1 dB.
minimum: -31
maximum: -1
phase_shift_90_degree boolean A 90º phase shift can be applied to the surround channels during encoding. This is useful for generating multichannel bitstreams which, when downmixed, can create a true Dolby Surround compatible output (Lt/Rt). Default setting is true.
dolby_surround_mode enum                          
not_indicated
enabled
disabled
Indicates to a Dolby Digital decoding product whether the two-channel encoded bitstream requires Pro Logic decoding.
dolby_digital_surround_ex_mode enum                          
not_indicated
enabled
disabled
This parameter is used to identify the encoded audio as material encoded in Surround EX. This parameter is only used if the encoded audio has two Surround channels.
surround_attenuation_3_db boolean –3 dB attenuation can be used to reduce the levels of the surround channels to compensate between the calibration of film dubbing stages and consumer replay environments. The surround channels in film studios are set 3 dB lower than the front channels (unlike consumer applications of 5.1), leading to the level on tape being 3 dB higher. Apply the 3 dB attenuation when using a master mixed in a film room.
dynamic_range_control object Dynamic Range Presets allow the user to select the compression characteristic that is applied to the Dolby Digital bitstream during decoding. These compression presets aid playback in less-than-ideal listening environments.
stereo_downmix_preference object The settings for the final downmix.

Dynamic_range_control

transcode.targets.audio.dolby_digital_plus.dynamic_range_control

Example Dynamic_range_control Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_output{default_extension}",
        "existing_files": "replace",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "width": 1280,
          "height": 720,
          "codec": "h264"
        },
        "audio": [
          {
            "codec": "dolby_digital_plus",
            "channels": 2,
            "sample_rate": 48000,
            "dolby_digital": {
              "bitstream_mode": "complete_main",
              "surround_attenuation_3_db": true,
              "dynamic_range_control": {
                "line_mode_profile": "speech"
              }
            }
          }
        ]
      }
    ]
  }
}
Name Type Description
line_mode_profile enum                          
none
film_std
film_light
music_std
music_light
speech
Settings for line compression mode.
rf_mode_profile enum                          
none
film_std
film_light
music_std
music_light
speech
Settings for RF compression mode.

Stereo_downmix_preference

transcode.targets.audio.dolby_digital_plus.stereo_downmix_preference

Example Stereo_downmix_preference Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_output{default_extension}",
        "existing_files": "replace",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "width": 1280,
          "height": 720,
          "codec": "h264"
        },
        "audio": [
          {
            "codec": "dolby_digital_plus",
            "channels": 2,
            "sample_rate": 48000,
            "dolby_digital_plus": {
              "bitstream_mode": "complete_main",
              "surround_attenuation_3_db": true,
              "stereo_down_mix_preference": {
                "loro_center_mix_level": -3,
                "loro_surround_mix_level": -3,
                "ltrt_center_mix_level": -3,
                "ltrt_surround_mix_level": -3,
                "preferred_downmix_mode": "loro"
              }
            }
          }
        ]
      }
    ]
  }
}
Name Type Description
mode enum                          
not_indicated
lo/ro
lt/rt
pro_logic_2
The mode for the downmix.
ltrt_center_mix_level number
string
The LtRt Center mix level.
ltrt_sur_mix_level number
string
The LtRt Surround mix level.
loro_center_mix_level number
string
The LoRo Center mix level.
loro_sur_mix_level number
string
The LoRo Surround mix level.

Audio Filters

Example Audio Filters Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264"
        },
        "audio": [
          {
            "codec": "heaac_v2",
            "channels": 4,
            "sample_rate": 48000,
            "bitrate_kb": 256,
            "filters": [
              {
                "kind": "normalize",
                "payload": {
                  "kind": "ebur128",
                  "payload": {
                    "allow_unprecise_mode": true,
                    "integrated_lufs": -16,
                    "true_peak_dbfs": -3
                  }
                }
              }
            ]
          },
          {
            "codec": "pcm",
            "channels": 2,
            "sample_rate": 48000,
            "sample_size": 16,
            "bitrate_kb": 512,
            "filters": [
              {
                "kind": "fade",
                "payload": {
                  "mode": "in",
                  "start_sec": 0,
                  "duration_sec": 3
                }
              }
            ]
          }
        ]
      }
    ]
  }
}

Hybrik supports a number of audio filters, including level control, normalization, and fading. Hybrik also supports the application of standard FFmpeg audio filters.

transcode.targets.audio.filters

Name Type Description
kind enum                          
ffmpeg
level
normalize
fade
The type of audio filter being applied.
default: ffmpeg
options object The options for where in the transcoding pipeline the filter will be applied.
include_conditions array Specifies conditions under which this filter will be applied. Can use JavaScript math.js nomenclature.
payload anyOf         
ffmpeg
level
normalize
fade
The audio filter payload.

Options

transcode.targets.audio.filters.options

Example Options Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "codec": "h264"
        },
        "audio": [
          {
            "codec": "heaac_v2",
            "channels": 4,
            "sample_rate": 48000,
            "bitrate_kb": 256,
            "filters": [
              {
                "kind": "level",
                "options": {
                  "position": "pre_normalize"
                },
                "payload": {
                  "kind": "ebur128",
                  "payload": {
                    "factor": 1.5
                  }
                }
              }
            ]
          }
        ]
      }
    ]
  }
}
Name Type Description
position enum                          
pre_analyze
pre_normalize
post_normalize
pre_convert
post_convert
default
Specifies where in the transcode processing pipeline the filter will be applied.
default: default

Ffmpeg

transcode.targets.audio.filters.ffmpeg

Example Ffmpeg Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
        },
        "audio": [
          {
            "codec": "heaac_v2",
            "channels": 2,
            "sample_rate": 48000,
            "filters": [
              {
                "kind": "ffmpeg",
                "payload": {
                  "ffmpeg_filter": "adelay=1500|0|500"
                }
              }
            ]
          }
        ]
      }
    ]
  }
}
Name Type Description
support_files array Support files referenced inside of filter descriptions.
ffmpeg_filter string
Set the arguments for custom FFmpeg audio filter processing. See https://ffmpeg.org/ffmpeg-filters.html

Level

transcode.targets.audio.filters.level

Example Level Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
        },
        "audio": [
          {
            "codec": "heaac_v2",
            "channels": 4,
            "sample_rate": 48000,
            "filters": [
              {
                "kind": "level",
                "payload": {
                  "factor": 1.5
                }
              }
            ]
          }
        ]
      }
    ]
  }
}
Name Type Description
factor number Multiplication factor. May lead to clipping.

Normalize

transcode.targets.audio.filters.normalize

Example Normalize Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
        },
        "audio": [
          {
            "codec": "heaac_v2",
            "channels": 4,
            "sample_rate": 48000,
            "filters": [
              {
                "kind": "normalize",
                "payload": {
                  "kind": "peak",
                  "payload": {
                    "peak_level_db": -3
                  }
                }
              }
            ]
          }
        ]
      }
    ]
  }
}
Name Type Description
kind enum                          
ebur128
peak
rms
dynamic
dolby_professional_loudness
All methods except dynamic will either use analyze values supplied here, or force 2-pass encoding to determine accurate levels for adjusting them in a later pass.
default: ebur128
payload anyOf         
ebur128
peak
rms
dynamic
dolby_professional_loudness
The payload for the audio normalize filter.

Ebur128

transcode.targets.audio.filters.normalize.ebur128

Example Ebur128 Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
        },
        "audio": [
          {
            "codec": "heaac_v2",
            "channels": 4,
            "sample_rate": 48000,
            "filters": [
              {
                "kind": "normalize",
                "payload": {
                  "kind": "ebur128",
                  "payload": {
                    "integrated_lufs": -16,
                    "true_peak_dbfs": -3,
                    "allow_unprecise_mode": false
                  }
                }
              }
            ]
          }
        ]
      }
    ]
  }
}
Name Type Description
source object Substitute EBU R.128 analysis results, for cases where the analysis was run separately.
integrated_lufs number LUFS = Loudness Units Full Scale. The Integrated value means the loudness integrated over the entire length of the program. European TV applications have a recommended level of -23 LUFS. Web services like iTunes and YouTube have targets of -16 and -14 respectively.
loudness_lra_lufs number LRA = Loudness Range. This quantifies the statistical distribution of short-term loudness within a program. A low LRA (1 to 3 LU) indicates material with a narrow dynamic range.
true_peak_dbfs number True Peak indicates whether there is maximum intersample peaking.
allow_unprecise_mode boolean Using unprecise mode allows for normalization without running an analysis first. As you might guess, this is less precise than running an EBU R.128 analysis as part of an Analyzer Task first.
analyzer_track_index integer This specifies which analyzer track data to use for this filter.
is_optional boolean If set to true, the transcode will not fail if this media type did not exist in the source.

Peak

transcode.targets.audio.filters.normalize.peak

Example Peak Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
        },
        "audio": [
          {
            "codec": "heaac_v2",
            "channels": 4,
            "sample_rate": 48000,
            "filters": [
              {
                "kind": "normalize",
                "payload": {
                  "kind": "peak",
                  "payload": {
                    "peak_level_db": -3
                  }
                }
              }
            ]
          }
        ]
      }
    ]
  }
}
Name Type Description
source object Normalization analysis results for cases where the analysis was run separately.
peak_level_db number Audio peak level in decibels (dB).
analyzer_track_index integer Limits analyzer to run on a particular track.
is_optional boolean If set to true, the transcode will not fail if this media type did not exist in the source.

Rms

transcode.targets.audio.filters.normalize.rms

Example Rms Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
        },
        "audio": [
          {
            "codec": "heaac_v2",
            "channels": 4,
            "sample_rate": 48000,
            "filters": [
              {
                "kind": "normalize",
                "payload": {
                  "kind": "rms",
                  "payload": {
                    "rms_level_db": -3
                  }
                }
              }
            ]
          }
        ]
      }
    ]
  }
}
Name Type Description
source object Normalization analysis results for cases where the analysis was run separately.
rms_level_db number The RMS level in dB.
analyzer_track_index integer This specifies which audio track to analyze.
is_optional boolean If set to true, the transcode will not fail if this media type did not exist in the source.

Dynamic

transcode.targets.audio.filters.normalize.dynamic

Example Dynamic Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
        },
        "audio": [
          {
            "codec": "heaac_v2",
            "channels": 4,
            "sample_rate": 48000,
            "filters": [
              {
                "kind": "normalize",
                "payload": {
                  "kind": "dynamic",
                  "payload": {
                    "peak_level_db": -3,
                    "window_size_samples": 256
                  }
                }
              }
            ]
          }
        ]
      }
    ]
  }
}
Name Type Description
peak_level_db number The Peak Level in dB.
rms_level_db number The RMS level in dB.
window_size_samples integer The window size (in samples) to use for the RMS calculation.

Dolby_professional_loudness

transcode.targets.audio.filters.normalize.dolby_professional_loudness

Example Dolby_professional_loudness Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
        },
        "audio": [
          {
            "codec": "ac3",
            "channels": 6,
            "filters": [
              {
                "kind": "normalize",
                "payload": {
                  "kind": "dolby_professional_loudness",
                  "payload": {
                    "regulation_type": "atsc_a85_fixed",
                    "integrated_lufs": -16,
                    "dialogue_intelligence": true
                  }
                }
              }
            ]
          }
        ]
      }
    ]
  }
}
Name Type Description
is_optional boolean If set to true, the transcode will not fail if this media type did not exist in the source.
correction_mode enum                          
pcm_normalization
metadata_update
The Dolby Professional Loudness Correction mode.
use_dialogue_intelligence boolean Dolby Dialogue Intelligence enabled.
regulation_type enum                          
atsc_a85_fixed
atsc_a85_agile
ebu_r128
freetv_op59
arib_tr_b32
manual
The type of regulation to use for Dolby Professional Loudness Correction.
loudness_target number The loudness LKFS target (-31..-1 dB). This is the equivalent of dialnorm in the Dolby Digital encoder.
minimum: -31
maximum: -1
speech_detection_threshold integer The speech detection threshold (0..100, increments of 1).
maximum: 100
limit_mode enum                          
true_peak
sample_peak
Specifies whether true peak or sample peak is used as the basis for leveling.
peak_limit_db number The peak value in dB to use for loudness correction, -8 to -0.1 dBTP (in increments of 0.1 dBTP).
minimum: -8
maximum: -0.1

Fade

transcode.targets.audio.filters.fade

Example Fade Object

{
  "uid": "transcode_media",
  "kind": "transcode",
  "payload": {
    "location": {
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted.mp4",
        "container": {
          "kind": "mp4"
        },
        "video": {
        },
        "audio": [
          {
            "codec": "heaac_v2",
            "channels": 4,
            "sample_rate": 48000,
            "filters": [
              {
                "kind": "fade",
                "payload": {
                  "mode": "in",
                  "start_sec": 0,
                  "duration_sec": 3
                }
              }
            ]
          }
        ]
      }
    ]
  }
}
Name Type Description
mode enum                          
in
out
The fade mode, in or out.
default: in
start_sec number Fade start time in seconds.
duration_sec number Fade duration in seconds.
curve enum                          
sinus
linear
The fade curve type.
default: sinus
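The curve option is not shown in the example above. Below is a sketch of a fade-out entry for the audio "filters" array, assuming for illustration a program roughly one minute long so that the fade begins at 57 seconds.

{
  "kind": "fade",
  "payload": {
    "mode": "out",
    "start_sec": 57,
    "duration_sec": 3,
    "curve": "linear"
  }
}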

Timecode

transcode.targets.timecode

Example Timecode Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted{default_extension}",
        "existing_files": "replace",
        "timecode": {
          "source": "start_value",
          "start_value": "01:00:00;00"
        },
        "container": {
          "kind": "mp4"
        },
        "video": {
          "width": 1280,
          "height": 720,
          "codec": "h264",
          "frame_rate": "30000/1001"
        },
        "audio": [
          {
            "codec": "aac",
            "channels": 2,
            "sample_rate": 48000,
            "sample_size": 16,
            "bitrate_kb": 96
          }
        ]
      }
    ]
  }
}
Name Type Description
include_if_source_has array This array allows for conditionally outputting tracks based on whether or not a specific input track exists. The tracks in the source are referred to by number reference: timecode[0] refers to the first timecode track.
include_conditions array An array defining the include conditions for this time code track.
verify boolean Enable or disable post transcode verification for this track.
default: true
source enum                          
auto
start_value
media
The source to be used for time code data. A specific value can be forced by selecting start_value.
source_timecode_selector enum                          
first
highest
lowest
mxf
gop
sdti
smpte
material_package
source_package
Specifies the metadata track to be used for time code data.
default: first
start_value string
Start time code, use hh:mm:ss:nr (non-drop) or hh:mm:ss;nr (drop).
timecode_frame_rate string
The frame rate to use for time code calculations.
force_drop boolean
Forces time code interpretation to be drop-frame.
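To carry the time code over from the source instead of forcing a start value, source can be set to media and a source_timecode_selector can pick the metadata track. A minimal sketch of the timecode object in that case (the selector value is for illustration only):

{
  "source": "media",
  "source_timecode_selector": "mxf"
}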

Subtitle

transcode.targets.subtitle

Example Subtitle Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted{default_extension}",
        "existing_files": "replace",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "width": 1280,
          "height": 720,
          "codec": "h264"
        },
        "audio": [
          {
            "codec": "aac",
            "channels": 2,
            "sample_rate": 48000,
            "sample_size": 16,
            "bitrate_kb": 96
          }
        ],
        "subtitle": {
          "source_map": "use_if_exists",
          "format": "scc",
          "language": "en"
        }
      }
    ]
  }
}
Name Type Description
include_if_source_has array This array allows for conditionally outputting tracks based on whether or not a specific input track exists. The tracks in the source are referred to by number reference: subtitle[0] refers to the first subtitle track.
include_conditions array Specifies conditions under which this subtitle will be used. Can use JavaScript math.js nomenclature.
verify boolean Enable or disable post transcode verification for this track.
default: true
source_map enum                          
use_if_exists
source_or_empty
required_in_source
This specifies the behavior to use when creating a subtitle track. Selecting source_or_empty will use the source's subtitle data or create an empty track if this data does not exist.
default: required_in_source
format enum                          
webvtt
srt
stl
ass
ttml
imsc1
dvbsub
scc
timed_text
The subtitle format to use.
pid integer
The program ID (PID) for the subtitle stream. This is only used for MPEG transport streams.
maximum: 8190
language string The ISO 639.2 three letter code for the language of the subtitle.
track_group_id string This indicates which Group this track belongs to. Multiple tracks with the same content but different bitrates would have the same track_group_id.
layer_id string This indicates which Layer this track belongs to. For example, this allows bundling one video layer and multiple audio layers with the same bitrates but different languages.
layer_affinities array This indicates which other layers this layer can be combined with. For example, to combine audio and video layers.
match_source_language boolean If true, sets the output subtitle track language to be the same as the language of the subtitle track in the source.
dvb_options object The options for European Digital Video Broadcasting (DVB) subtitles.
webvtt_options object Options for the WebVTT output.
scc_options object Options for the SCC output.
mainconcept_stream_mux_options string
Provide direct stream instruction to the MainConcept multiplexer. Values are constructed as "prop=val,prop=val". See MainConcept documentation for valid values.

Dvb_options

transcode.targets.subtitle.dvb_options

Example Dvb_options Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted{default_extension}",
        "existing_files": "replace",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "width": 1280,
          "height": 720,
          "codec": "h264"
        },
        "audio": [
          {
            "codec": "aac",
            "channels": 2,
            "sample_rate": 48000,
            "sample_size": 16,
            "bitrate_kb": 96
          }
        ],
        "subtitle": {
          "source_map": "use_if_exists",
          "format": "dvbsub",
          "language": "en",
          "dvb_options": {
            "page_id": 3,
            "use_full_width_regions": true,
            "video_start_timecode": "01:00:00:00",
            "width_overide": 1280,
            "height_overide": 720
          }
        }
      }
    ]
  }
}
Name Type Description
sd_in_hd boolean Positions SD subtitles correctly in an HD frame.
use_df_timecode boolean Uses drop-frame timecode timing for the subtitles.
page_id integer Specifies the page_id for the subtitle.
dvb_subtitling_type integer Specifies the type of DVB subtitling.
dvb_skip_dds boolean Skips a Display Definition Segment.
video_start_timecode string The start timecode for the subtitle.
color_depth_bits integer The bit depth for the subtitle.
use_region_fill_flag boolean Flag setting whether to fill the defined region. The fill is completed before any text is rendered.
use_full_width_regions boolean Flag setting whether the region should encompass the entire width of the screen. No two regions can be presented horizontally next to each other.
use_full_width_objects boolean Flag setting whether objects should encompass the entire width of the screen.
non_empty_pcs_on_hide boolean Flag to send non-empty Page Composition Segment on subtitle hide command.
send_empty_bitmap_on_hide boolean Flag to send empty bitmap on hide command.
use_transparent_color_0 boolean Flag to set color0 to be transparent.
width_overide integer Override the width of the subtitle with this value.
height_overide integer Override the height of the subtitle with this value.
dar_overide number Override the Display Aspect Ratio of the subtitle with this value.
font_height integer The height in pixels for the subtitle font.
outline_size integer The thickness of the outline to use around the subtitle font.
bold boolean Flag to set the font to bold.

Webvtt_options

transcode.targets.subtitle.webvtt_options

Example Webvtt_options Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted{default_extension}",
        "existing_files": "replace",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "width": 1280,
          "height": 720,
          "codec": "h264"
        },
        "audio": [
          {
            "codec": "aac",
            "channels": 2,
            "sample_rate": 48000,
            "sample_size": 16,
            "bitrate_kb": 96
          }
        ],
        "subtitle": {
          "source_map": "use_if_exists",
          "format": "webvtt",
          "language": "en",
          "webvtt_options": {
            "cue_numbering": true
          }
        }
      }
    ]
  }
}
Name Type Description
cue_numbering boolean Flag to turn on cue numbering in the WebVTT output.

Scc_options

transcode.targets.subtitle.scc_options

Example Scc_options Object

{
  "uid": "transcode_task",
  "kind": "transcode",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "targets": [
      {
        "file_pattern": "{source_basename}_converted{default_extension}",
        "existing_files": "replace",
        "container": {
          "kind": "mp4"
        },
        "video": {
          "width": 1280,
          "height": 720,
          "codec": "h264"
        },
        "audio": [
          {
            "codec": "aac",
            "channels": 2,
            "sample_rate": 48000,
            "sample_size": 16,
            "bitrate_kb": 96
          }
        ],
        "subtitle": {
          "source_map": "use_if_exists",
          "format": "webvtt",
          "language": "en",
          "scc_options": {
            "drop_frame": true
          }
        }
      }
    ]
  }
}
Name Type Description
drop_frame boolean Flag to toggle drop/non-drop time code in scc.

Analyze Task

Analyze

Example Analyze Object

{
  "name": "Hybrik Analyze Example",
  "payload": {
    "elements": [
      {
        "uid": "source_file",
        "kind": "source",
        "payload": {
          "kind": "asset_url",
          "payload": {
            "storage_provider": "s3",
            "url": "s3://my_bucket/my_input_folder/my_file.mp4"
          }
        }
      },
      {
        "uid": "analyze_task",
        "kind": "analyze",
        "payload": {
          "general_properties": {
            "enabled": true
          },
          "deep_properties": {
            "audio": [
              {
                "volume": {
                  "enabled": true
                },
                "levels": {
                  "enabled": true
                }
              }
            ],
            "video": {
              "black": {
                "duration_sec": 5,
                "enabled": true
              },
              "black_borders": {
                "black_level": 0.08,
                "enabled": true
              },
              "interlacing": {
                "enabled": true
              },
              "levels": {
                "chroma_levels": true,
                "histograms": true,
                "enabled": true
              }
            }
          }
        }
      }
    ],
    "connections": [
      {
        "from": [
          {
            "element": "source_file"
          }
        ],
        "to": {
          "success": [
            {
              "element": "analyze_task"
            }
          ]
        }
      }
    ]
  }
}

One of the features built into Hybrik is the ability to analyze the properties of a file. There are two types of analysis that can be performed -- general_properties and deep_properties. The general_properties include things like file size, video format, number of channels, etc. This is metadata that can be determined very quickly without actually decoding the file. In contrast, the deep_properties require decoding the file in order to analyze the individual video and audio streams. The deep_properties cover things like black detection, silence detection, PSNR, VMAF, etc. The Analyze Task returns information about the file, but does not make any "judgement" about whether that information represents a good or bad file. If you would like your workflow to report errors or warnings based on the analysis, you can run a QC Task after the Analyze Task. The Analyze Task results are embedded in the job results. The results include a wide range of information depending on the type of analysis. Typically, the returned values include minimum, maximum, and mean values, as well as the frame locations of the min/max values. Additionally, for analysis types that have time-varying results (like datarate), a set of 100 samples distributed over the duration of the file is returned for graphing purposes.

analyze

Name Type Description
options object Options for the Analyze Task.
source_pipeline object Pipeline modifications (such as trimming) to apply prior to running the analysis task.
compare_asset object Compare asset to use for comparative analyzers.
general_properties object Metadata analysis of a file or stream. Does not require decoding of the asset.
deep_properties object Object specifying which deep properties to analyze. Deep properties require decoding the asset.
reports array An object or array of objects describing the location and creation conditions for reports.

Options

Example Options Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "options": {
      "quick_scan": {
        "include_start": true,
        "include_end": true,
        "nr_of_slices": 10,
        "slice_duration_sec": 10
      }
    },
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "black_borders": {
          "black_level": 0.08,
          "enabled": true
        }
      }
    }
  }
}

analyze.options

Name Type Description
asset_db_cache boolean Enable analyze result database caching.
report_version integer Analyze report version, to preserve legacy elements for existing code parsing.
quick_scan object Analyzer QuickScan properties.

Quick_scan

analyze.options.quick_scan

Example Quick_scan Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "options": {
      "quick_scan": {
        "slice_duration_sec": 10,
        "slice_interval_sec": 100
      }
    },
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "audio": {
      },
      "video": {
      }
    }
  }
}
Name Type Description
include_start boolean Forces the slices to include the start of the file.
include_end boolean Forces the slices to include the end of the file.
nr_of_slices integer Number of slices to be included in the scan.
slice_duration_sec number The duration in seconds of each slice.
slice_interval_sec number The interval between slices in seconds.
coverage_percent number The amount of the file to cover. 1 = 1%, 100 = 100%.
scan_intervals array An array specifying specific intervals to be scanned.
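
For example, instead of a fixed number of slices, a quick_scan can specify coverage_percent to sample a percentage of the file. The following is a sketch; the values are illustrative only.

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "options": {
      "quick_scan": {
        "coverage_percent": 10,
        "slice_duration_sec": 10
      }
    },
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "levels": {
          "enabled": true
        }
      }
    }
  }
}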

Source_pipeline

Example Source_pipeline Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "source_pipeline": {
      "trim": {
        "inpoint_sec": 60,
        "outpoint_sec": 120
      },
      "general_properties": {
        "enabled": true
      },
      "deep_properties": {
        "audio": {
          "ebur128": {
            "enabled": true,
            "scale_meter": 9
          }
        }
      }
    }
  }

analyze.source_pipeline

Name Type Description
trim anyOf         
by_sec_in_out
by_sec_in_dur
by_timecode
by_asset_timecode
by_frame_nr
by_section_nr
by_media_track
by_nothing
Object defining the type of trim operation to perform on an asset.
accelerated_prores boolean
Flag to use native, accelerated, Apple ProRes decoder. Default true.

Compare_asset

Example Compare_asset Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "compare_asset": {
      "kind": "asset_url",
      "payload": {
        "storage_provider": "s3",
        "url": "s3://my_bucket/my_reference_file.mov"
      }
    }
  },
  "general_properties": {
    "enabled": true
  },
  "deep_properties": {
    "audio": {
      "levels": {
        "enabled": true
      }
    },
    "video": {
      "settings": {
        "comparative": {
          "size_selector": "config",
          "width": 1280,
          "height": 720
        }
      },
      "vmaf": {
        "enabled": true
      }
    }
  }
}

analyze.compare_asset

Name Type Description
kind enum                          
asset_url
asset_urls
asset_complex
The type of asset. asset_url is a single asset, asset_urls is an array of assets, and asset_complex is an asset assembled from multiple components.
payload anyOf         
asset_url
asset_urls
The asset description payload.

General_properties

Example General_properties Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "audio": {
        "volume": {
          "enabled": true
        }
      },
      "video": {
        "black": {
          "duration_sec": 5,
          "enabled": true
        }
      }
    }
  }
}

analyze.general_properties

Name Type Description
enabled boolean Enable this analyze operation.
mov_atom_descriptor_style enum                          
none
condensed
by_track
full
"none": do not list atoms. "condensed": pick the most important atoms and list linearly with the belonging tracks. "by_track": show the full hierarchy but list along with tracks. "full": show the full file hierarchy in the asset element.

Deep_properties

There are a variety of deep_properties that can be analyzed. These properties are broken into audio and video sections. You may have multiple types of audio and video analyzers running simultaneously as part of a single task. You may also have different audio analyzers run on different audio tracks. Because audio tracks are represented by an array, the audio analyzers are specified as a corresponding array, with each element applying to the matching track, as in the sketch below.
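
For instance, a hypothetical two-track source could have loudness measured on the first track and silence detection run on the second, with track_selector making the mapping explicit. The indices and thresholds below are illustrative only.

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "audio": [
        {
          "track_selector": {
            "index": 0
          },
          "ebur128": {
            "enabled": true
          }
        },
        {
          "track_selector": {
            "index": 1
          },
          "silence": {
            "enabled": true,
            "noise_db": -60,
            "duration_sec": 1
          }
        }
      ]
    }
  }
}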

Example Deep_properties Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "audio": [
        {
          "levels": {
            "enabled": true
          },
          "volume": {
            "enabled": true
          },
          "ebur128": {
            "enabled": true
          },
          "silence": {
            "enabled": true,
            "noise_db": -60,
            "duration_sec": 1
          }
        }
      ],
      "video": {
        "black": {
          "enabled": true,
          "black_level": 0.03,
          "duration_sec": 1
        },
        "black_borders": {
          "enabled": true
        },
        "interlace_mode": {
          "enabled": true
        },
        "levels": {
          "enabled": true
        },
        "blockiness": {
          "algorithm": "hybm",
          "enabled": true,
          "violation_frame_window": 5,
          "violation_threshold": 0.2,
          "max_report_violations": 5
        }
      }
    }
  }
}

analyze.deep_properties

Name Type Description
audio array Audio tracks are represented by an array, where each element of the array refers to a track. The deep properties analysis is also an array of objects. Each object in the array consists of the analyses that will be performed on the corresponding audio track.
video object Object containing all of the video deep_properties analyses to be performed.

Audio

analyze.deep_properties.audio

Example Audio Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "audio": [
        {
          "track_selector": {
            "index": 2
          },
          "volume": {
            "enabled": true
          },
          "levels": {
            "enabled": true
          }
        }
      ],
      "video": {
        "black": {
          "duration_sec": 5,
          "enabled": true
        }
      }
    }
  }
}
Name Type Description
track_selector object Mechanism to select a specific audio track.
levels object Performs a deep analysis of the audio track(s), including DC offset, RMS peak, level etc.
ebur128 object Performs an EBU R.128 loudness determination on the audio track(s).
dolby_professional_loudness object Performs Dolby loudness analysis on the audio track(s).
volume object Uses a simple volume measurement. This is less precise than deep_stats but offers higher performance.
silence object Detect silent segments in the audio track(s).
psnr object Determine the PSNR value between an asset and a reference.
emergency_alert object Detect emergency alert signals in the audio track(s).
channel_compositions object Determine the Channel Composition in the audio track(s).
test_tone object Find a test tone.

Track_selector

analyze.deep_properties.audio.track_selector

Example Track_selector Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "audio": [
        {
          "track_selector": {
            "pid": 381
          },
          "volume": {
            "enabled": true
          },
          "levels": {
            "enabled": true
          }
        }
      ],
      "video": {
        "black": {
          "duration_sec": 5,
          "enabled": true
        }
      }
    }
  }
}
Name Type Description
id integer The ID of the audio track to be analyzed.
pid integer The PID of the audio track to be analyzed.
index integer The index (0-based) of the audio track to be analyzed.
is_optional boolean If set to true, the analyzer will not fail if this media type did not exist in the source.

Levels

analyze.deep_properties.audio.levels

Example Levels Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "deep_properties": {
      "audio": [
        {
          "levels": {
            "enabled": true
          }
        }
      ]
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type does not exist in the source.
window_length_sec number Configures the RMS measurement window size.
minimum: 0.01
maximum: 10
default: 0.05

Ebur128

analyze.deep_properties.audio.ebur128

Example Ebur128 Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "audio": [
        {
          "ebur128": {
            "enabled": true,
            "scale_meter": 9
          }
        }
      ]
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type did not exist in the source.
scale_meter number EBU R.128 scale meter, common values are 9 and 18, default is 9.
minimum: 9
maximum: 18
default: 9
target_reference object The LUFS and true peak values to be used for EBU R.128 normalization.

Target_reference

analyze.deep_properties.audio.ebur128.target_reference

Example Target_reference Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "audio": [
        {
          "ebur128": {
            "enabled": true,
            "scale_meter": 18,
            "target_reference": {
              "integrated_lufs": -16.45,
              "loudness_lra_lufs": -12.6,
              "true_peak_dbfs": -3.5
            }
          }
        }
      ]
    }
  }
}
Name Type Description
integrated_lufs number Reference integrated LUFS value.
loudness_lra_lufs number Reference LRA LUFS value.
true_peak_dbfs number Reference true peak DBFS value.

Dolby_professional_loudness

analyze.deep_properties.audio.dolby_professional_loudness

Example Dolby_professional_loudness Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "deep_properties": {
      "audio": [
        {
          "dolby_professional_loudness": {
            "enabled": true,
            "loudness_target": -24,
            "use_dialogue_intelligence": true,
            "regulation_type": "ebu_r128"
          }
        }
      ]
    }
  }
}

Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type did not exist in the source.
correction_mode enum                          
pcm_normalization
metadata_update
The Dolby Professional Loudness Correction mode.
use_dialogue_intelligence boolean Dolby Dialogue Intelligence enabled.
regulation_type enum                          
atsc_a85_fixed
atsc_a85_agile
ebu_r128
freetv_op59
arib_tr_b32
manual
The type of regulation to use for Dolby Professional Loudness Correction.
loudness_target number The loudness LKFS target (-31..-1 dB). This is the equivalent of dialnorm in the Dolby Digital encoder.
minimum: -31
maximum: -1
speech_detection_threshold integer The speech detection threshold (0..100, increments of 1).
maximum: 100
limit_mode enum                          
true_peak
sample_peak
Specifies whether true peak or sample peak is used as the basis for leveling.
peak_limit_db number The peak value in dB to use for loudness correction, -8 to -0.1 dBTP (in increments of 0.1 dBTP).
minimum: -8
maximum: -0.1

Volume

analyze.deep_properties.audio.volume

Example Volume Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "audio": [
        {
          "volume": {
            "enabled": true,
            "is_optional": true
          }
        }
      ]
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type did not exist in the source.

Silence

analyze.deep_properties.audio.silence

Example Silence Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "audio": [
        {
          "silence": {
            "enabled": true,
            "noise_db": -60,
            "duration_sec": 1
          }
        }
      ]
    }
  }
}

Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type did not exist in the source.
duration_sec number Silence must exceed this duration to trigger detection.
minimum: 1
maximum: 3600
default: 10
noise_db number The audio level must be above this value to be detected as a true audio signal.
minimum: -90
default: -60

Psnr

analyze.deep_properties.audio.psnr

Example Psnr Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "compare_asset": {
        "kind": "asset_url",
        "payload": {
          "storage_provider": "s3",
          "url": "s3://my_bucket/my_comparison_master.mp4"
        }
      },
      "audio": [
        {
          "psnr": {
            "enabled": true
          }
        }
      ]
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type does not exist in the source.

Emergency_alert

analyze.deep_properties.audio.emergency_alert

Example Emergency_alert Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "audio": [
        {
          "emergency_alert": {
            "enabled": true,
            "is_optional": false
          }
        }
      ]
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type did not exist in the source.

Channel_compositions

analyze.deep_properties.audio.channel_compositions

Example Channel_compositions Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "reports": [
      {
        "create_condition": "always",
        "file_pattern": "{source_basename}_audio_report.pdf",
        "location": {
          "storage_provider": "s3",
          "url": "s3://my_bucket/my_analyze_folder"
        }
      }
    ],
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "audio": [
        {
          "channel_compositions": {
            "enabled": true
          }
        }
      ]
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type does not exist in the source.

Test_tone

analyze.deep_properties.audio.test_tone

Example Test_tone Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "audio": [
        {
          "test_tone": {
            "enabled": true
          }
        }
      ]
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type does not exist in the source.

Video

analyze.deep_properties.video

Example Video Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "audio": {
        "volume": {
          "enabled": true
        }
      },
      "video": {
        "black": {
          "duration_sec": 5,
          "enabled": true
        },
        "black_borders": {
          "black_level": 0.08,
          "enabled": true
        },
        "interlacing": {
          "enabled": true
        },
        "levels": {
          "chroma_levels": true,
          "histograms": true,
          "enabled": true
        }
      }
    }
  }
}
Name Type Description
track_selector object Mechanism to select a specific video track for analysis.
settings object Settings for the comparison file, such as filters to be applied prior to comparison.
black object Detect segments with black video.
black_borders object Detect cropping, such as letter- or pillarboxes.
interlacing object Detect interlacing properties of the video by scanning frames.
levels object Analyze the video and detect min/max Y,Cb,Cr etc.
blockiness object Detect compression block artifacts.
hdr_stats object Detect HDR signal levels.
complexity object Produces a measurement for how complex the content is over time.
content_variance object Produces a measurement for how much the content is changing over time.
scene_change_score object Detects scene changes probabilities.
pse object Detect Photo-Sensitive Epilepsy (PSE) artifacts.
compressed_stats object Determines compressed frame sizes etc.
compressed_quality object Determines quality metrics of the underlying bitstream, for example PQ values.
ssim object Determine the SSIM value between an asset and a reference file.
ms_ssim object Determine the MS-SSIM value between an asset and a reference file.
psnr object Determine the PSNR value between an asset and a reference file.
vmaf object Uses the Netflix Video Multi-Method Assessment Fusion (VMAF) methods to assess the quality of an asset compared with a reference file.

Track_selector

analyze.deep_properties.video.track_selector

Example Track_selector Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "track_selector": {
          "pid": 380
        },
        "black": {
          "duration_sec": 5,
          "enabled": true
        }
      }
    }
  }
}
Name Type Description
id integer Track selector by stream ID.
pid integer Track selector by stream PID (transport streams only).
index integer Track selector by stream index.
is_optional boolean If set to true, the analyzer will not fail if this media type does not exist in the source.

Settings

analyze.deep_properties.video.settings

Example Settings Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "compare_asset": {
      "kind": "asset_url",
      "payload": {
        "storage_provider": "s3",
        "url": "s3://my_bucket/my_reference_file.mov"
      }
    },
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "levels": {
          "enabled": true
        },
        "settings": {
          "comparative": {
            "width": 1280,
            "height": 720,
            "compare_filters": [
              {
                "kind": "crop",
                "payload": {
                  "top": 10,
                  "bottom": 10
                }
              }
            ]
          }
        }
      }
    }
  }
}
Name Type Description
comparative object Operations to be applied to the comparative file before the comparison is executed. For video filter settings, please see the transcoding section.

Comparative

analyze.deep_properties.video.settings.comparative

Example Comparative Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "compare_asset": {
      "kind": "asset_url",
      "payload": {
        "storage_provider": "s3",
        "url": "s3://my_bucket/my_reference_file.mov"
      }
    }
  },
  "general_properties": {
    "enabled": true
  },
  "deep_properties": {
    "video": {
      "settings": {
        "size_selector": "compare_asset",
        "compare_filters": [
          {
            "kind": "crop",
            "payload": {
              "top": 10,
              "bottom": 10
            }
          }
        ],
        "ssim": {
          "enabled": true
        }
      }
    }
  }
}
Name Type Description
size_selector enum                          
main_asset
compare_asset
reference_asset
config
When comparing two files, select which file's size (width x height) will be used as the reference size. Choose config to set custom scaling.
width integer Sets the custom width that both files will be scaled to prior to comparison.
height integer Sets the custom height that both files will be scaled to prior to comparison.
chroma_format_selector enum                          
main_asset
compare_asset
reference_asset
config
Select which file's chroma format will be used as the reference. Choose config to set custom values.
chroma_format enum                          
yuv411p
yuv420p
yuv422p
yuv420p10le
yuv422p10le
yuv444p10le
yuva444p10le
yuv420p12le
yuv422p12le
yuv444p12le
yuv420p16le
yuv422p16le
yuv444p16le
yuva444p16le
yuvj420p
yuvj422p
rgb24
rgb48be
rgba64be
rgb48le
rgba64le
Available chroma formats.
compare_filters array Filters to be applied to the reference file prior to the file comparison. See the Transcode section for filter definitions.

Black

analyze.deep_properties.video.black

Example Black Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "black": {
          "enabled": true,
          "black_level": 0.03,
          "duration_sec": 1
        }
      }
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type does not exist in the source.
duration_sec number Black video must exceed this duration to trigger detection.
minimum: 1
maximum: 3600
default: 10
black_level number The video signal level must be above this value to be detected as non-black.
maximum: 1
default: 0.1
black_pixel_ratio number Ratio of black vs. non-black pixels required for a picture to be classified as black. 1.0: all pixels must be black, 0.0: no pixels need to be black.
maximum: 1
default: 0.98
ire_range_mode enum                          
auto
full
limited
Determines if the analyzer shall use the assumed IRE ranges from the file (auto), full (0..255 for 8 bit) or limited (16..235 for 8 bit).
default: auto

Black_borders

analyze.deep_properties.video.black_borders

Example Black_borders Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "black_borders": {
          "enabled": true
        }
      }
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type does not exist in the source.
black_level number The video signal level must be above this value to be detected as non-black.
maximum: 1
default: 0.1

Interlacing

analyze.deep_properties.video.interlacing

Example Interlacing Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "interlacing": {
          "enabled": true
        }
      }
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type does not exist in the source.

Levels

analyze.deep_properties.video.levels

Example Levels Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "levels": {
          "chroma_levels": true,
          "histograms": true,
          "enabled": true
        }
      }
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
st2084_levels boolean Report ST.2084 levels.
chroma_levels boolean Report chroma levels.
histograms boolean Produce level histograms.
is_optional boolean If set to true, the analyzer will not fail if this media type does not exist in the source.

Blockiness

analyze.deep_properties.video.blockiness

Example Blockiness Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "blockiness": {
          "enabled": true,
          "algorithm": "gbim"
        }
      }
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
algorithm enum                          
gbim
npbm
hybm
GBIM and NPBM are standard algorithms for blockiness analysis; HYBM is a Hybrik proprietary algorithm, less sensitive to false positives.
violation_frame_window number
Sets the minimum number of frames where the threshold is exceeded to count as a violation. Default is 1.
violation_threshold number
Sets the blockiness threshold to trigger a violation.
max_report_violations integer
Sets the maximum number of violations to be reported.
is_optional boolean If set to true, the analyzer will not fail if this media type did not exist in the source.

Hdr_stats

analyze.deep_properties.video.hdr_stats

Example Hdr_stats Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "hdr_stats": {
          "enabled": true,
          "ire_range_mode": "limited",
          "transfer_function": "hlg"
        }
      }
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
transfer_function enum                          
pq
hlg
Select between PQ1000 (pq) and HLG (hlg).
ire_range_mode enum                          
auto
full
limited
Determines if the analyzer shall use the assumed IRE ranges from the file (auto), full (0..255 for 8 bit) or limited (16..235 for 8 bit).
default: auto
max_luminance number Set an assumed max luminance, range 0..1.
min_luminance number Set an assumed minimum luminance, range 0..1.
is_optional boolean If set to true, the analyzer will not fail if this media type did not exist in the source.

Complexity

analyze.deep_properties.video.complexity

Example Complexity Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "complexity": {
          "enabled": true,
          "analysis_width": 426,
          "analysis_height": 240
        }
      }
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
analysis_width integer Scales the content to this width before calculating the complexity.
analysis_height integer Scales the content to this height before calculating the complexity.
is_optional boolean If set to true, the analyzer will not fail if this media type did not exist in the source.

Content_variance

analyze.deep_properties.video.content_variance

Example Content_variance Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "content_variance": {
          "enabled": true
        }
      }
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type did not exist in the source.
freeze_frames object Object containing parameters for freeze frame identification.

Freeze_frames

analyze.deep_properties.video.content_variance.freeze_frames

Example Freeze_frames Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "content_variance": {
          "enabled": true,
          "freeze_frames": {
            "threshold": .01,
            "min_duration_sec": 1
          }
        }
      }
    }
  }
}
Name Type Description
threshold number The threshold value for determining frame similarity.
min_duration_sec number The minimum duration (in seconds) for a frame to be considered a freeze frame.

Scene_change_score

analyze.deep_properties.video.scene_change_score

Example Scene_change_score Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "scene_change_score": {
          "enabled": true,
          "cutlist_threshold": 0.4
          }
        }
      }
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
thresholds array An array of thresholds (range 0.1 to 1.0) to trigger a scene change detection.
cutlist_threshold number Sets scene change threshold for cutlist generation (range 0.3 to 1.0).
minimum: 0.3
maximum: 1
is_optional boolean
If the track is not present, do not generate an error.
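
The example above sets only cutlist_threshold; the thresholds array can also be supplied to trigger detection at several sensitivity levels. The values in this sketch are illustrative only.

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "scene_change_score": {
          "enabled": true,
          "thresholds": [0.3, 0.5, 0.8]
        }
      }
    }
  }
}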

PSE

Hybrik supports the analysis of video for Photo-Sensitive Epilepsy triggers. The PSE analysis looks for video effects that might trigger epileptic seizures in people who are sensitive to particular visual stimuli. The test identifies luminance flashes and patterns that exceed a prescribed amplitude and frequency limit. The Hybrik PSE analysis is approved by the Digital Production Partnership for use with UK broadcast content.

analyze.deep_properties.video.pse

Example PSE Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "pse": {
          "enabled": true
          }
        }
      }
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type did not exist in the source.

Compressed_stats

analyze.deep_properties.video.compressed_stats

Example Compressed_stats Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "compressed_stats": {
          "enabled": true,
          "window_size": 10
          }
        }
      }
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
window_size number The window size in seconds to be used for the analysis.
is_optional boolean
If the track is not present, do not generate an error.

Compressed_quality

analyze.deep_properties.video.compressed_quality

Example Compressed_quality Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "video": {
        "compressed_quality": {
          "enabled": true
        }
      }
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type did not exist in the source.
block_rel_pq_delta_threshold number Relative PQ delta threshold.
block_abs_pq_delta_threshold number Absolute PQ delta threshold.
block_abs_pq_threshold number Absolute PQ threshold.

Ssim

analyze.deep_properties.video.ssim

Example Ssim Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "compare_asset": {
      "kind": "asset_url",
      "payload": {
        "storage_provider": "s3",
        "url": "s3://my_bucket/my_reference_file.mov"
      }
    }
  },
  "general_properties": {
    "enabled": true
  },
  "deep_properties": {
    "video": {
      "ssim": {
        "enabled": true
      }
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type did not exist in the source.
results_file object The location for the output SSIM analysis data.

Results_file

analyze.deep_properties.video.ssim.results_file

Example Results_file Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "compare_asset": {
      "kind": "asset_url",
      "payload": {
        "storage_provider": "s3",
        "url": "s3://my_bucket/my_reference_file.mov"
      }
    }
  },
  "general_properties": {
    "enabled": true
  },
  "deep_properties": {
    "video": {
      "ssim": {
        "enabled": true,
        "results_file": {
          "file_pattern": "{source_basename}_ssim_analysis.txt",
          "location": {
            "storage_provider": "s3",
            "url": "s3://my_bucket/my_folder"
          }
        }
      }
    }
  }
}
Name Type Description
location object The location for the output SSIM analysis data.
file_pattern string The file pattern for the SSIM analysis data.

Ms_ssim

analyze.deep_properties.video.ms_ssim

Example Ms_ssim Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "compare_asset": {
      "kind": "asset_url",
      "payload": {
        "storage_provider": "s3",
        "url": "s3://my_bucket/my_reference_file.mov"
      }
    }
  },
  "general_properties": {
    "enabled": true
  },
  "deep_properties": {
    "video": {
      "ms_ssim": {
        "enabled": true
      }
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type did not exist in the source.

Psnr

analyze.deep_properties.video.psnr

Example Psnr Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "compare_asset": {
      "kind": "asset_url",
      "payload": {
        "storage_provider": "s3",
        "url": "s3://my_bucket/my_reference_file.mov"
      }
    }
  },
  "general_properties": {
    "enabled": true
  },
  "deep_properties": {
    "video": {
      "settings": {
        "comparative": {
          "size_selector": "main_asset"
        }
      },
      "psnr": {
        "enabled": true
      }
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
is_optional boolean If set to true, the analyzer will not fail if this media type did not exist in the source.
results_file object The location for the output PSNR analysis data.

Results_file

analyze.deep_properties.video.psnr.results_file

Example Results_file Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "compare_asset": {
      "kind": "asset_url",
      "payload": {
        "storage_provider": "s3",
        "url": "s3://my_bucket/my_reference_file.mov"
      }
    }
  },
  "general_properties": {
    "enabled": true
  },
  "deep_properties": {
    "video": {
      "settings": {
        "comparative": {
          "size_selector": "main_asset"
        }
      },
      "psnr": {
        "enabled": true,
        "results_file": {
          "file_pattern": "{source_basename}_psnr_analysis.txt",
          "location": {
            "storage_provider": "s3",
            "url": "s3://my_bucket/my_folder"
          }
        }
      }
    }
  }
}
Name Type Description
location object The file location for the PSNR analysis data.
file_pattern string The file pattern for the PSNR analysis data.

Vmaf

analyze.deep_properties.video.vmaf

Example Vmaf Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "compare_asset": {
      "kind": "asset_url",
      "payload": {
        "storage_provider": "s3",
        "url": "s3://my_bucket/my_reference_file.mov"
      }
    }
  },
  "deep_properties": {
    "video": {
      "settings": {
        "comparative": {
          "size_selector": "config",
          "width": 1280,
          "height": 720
        }
      },
      "vmaf": {
        "enabled": true,
        "use_phone_model": true
      }
    }
  }
}
Name Type Description
enabled boolean Enables this analysis type.
use_phone_model boolean Use VMAF phone model.
threads integer Manually specify number of VMAF threads to use.
results_file object The location for the output VMAF analysis data.
is_optional boolean If set to true, the analyzer will not fail if this media type did not exist in the source.
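
The vmaf results_file is not shown in the example above. The sketch below assumes it takes the same location and file_pattern fields as the SSIM and PSNR results_file objects; the file name and bucket are placeholders.

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "compare_asset": {
      "kind": "asset_url",
      "payload": {
        "storage_provider": "s3",
        "url": "s3://my_bucket/my_reference_file.mov"
      }
    },
    "deep_properties": {
      "video": {
        "vmaf": {
          "enabled": true,
          "results_file": {
            "file_pattern": "{source_basename}_vmaf_analysis.txt",
            "location": {
              "storage_provider": "s3",
              "url": "s3://my_bucket/my_folder"
            }
          }
        }
      }
    }
  }
}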

Reports

Example Reports Object

{
  "uid": "analyze_task",
  "kind": "analyze",
  "payload": {
    "reports": [
      {
        "create_condition": "always",
        "file_pattern": "{source_basename}_report.pdf",
        "location": {
          "storage_provider": "s3",
          "path": "s3://my_bucket/my_reports_folder"
        }
      }
    ],
    "general_properties": {
      "enabled": true
    },
    "deep_properties": {
      "audio": [
        {
          "volume": {
            "enabled": true
          },
          "levels": {
            "enabled": true
          }
        }
      ],
      "video": {
        "black_borders": {
          "black_level": 0.08,
          "enabled": true
        },
        "levels": {
          "chroma_levels": true,
          "histograms": true,
          "enabled": true
        }
      }
    }
  }
}

analyze.reports

Name Type Description
create_condition enum                          
always
on_failure
on_success
Determines whether to create the report always, only when the task fails, or only when the task succeeds.
file_pattern string
This describes the target file name. Placeholders such as {source_basename} for the source file name are supported.
default: {source_basename}
location object Output location for the report.

QC Task

QC

Example QC Object

{
  "name": "Hybrik QC Example",
  "payload": {
    "elements": [
      {
        "uid": "source_file",
        "kind": "source",
        "payload": {
          "kind": "asset_url",
          "payload": {
            "storage_provider": "s3",
            "url": "s3://my_bucket/my_folder/my_file.mp4"
          }
        }
      },
      {
        "uid": "analyze_task",
        "kind": "analyze",
        "payload": {
          "general_properties": {
            "enabled": true
          },
          "deep_properties": {
            "audio": {
              "volume": {
                "enabled": true
              },
              "levels": {
                "enabled": true
              }
            },
            "video": {
              "black": {
                "duration_sec": 5,
                "enabled": true
              },
              "black_borders": {
                "black_level": 0.08,
                "enabled": true
              },
              "interlacing": {
                "enabled": true
              },
              "levels": {
                "chroma_levels": true,
                "histograms": true,
                "enabled": true
              }
            }
          }
        }
      },
      {
        "uid": "qc_task",
        "kind": "qc",
        "payload": {
          "report": {
            "file_pattern": "{source_basename}_qc_report.pdf",
            "location": {
              "storage_provider": "s3",
              "path": "s3://my_bucket/my_folder"
            }
          },
          "conditions": {
            "pass": [
              {
                "condition": "source.video.width >= 1920)",
                "message_pass": "Video is HD; actual value = {source.video.width}",
                "message_fail": "Video is not HD; actual value = {source.video.width}"
              }
            ],
            "warn": [
              {
                "condition": "(abs(source.video.bitrate_kb - 50000)<5000)",
                "message_fail": "Bitrate: WARNING video bitrate = 50Mbps; actual value = {source.video.bitrate_kb}"
              }
            ]
          }
        }
      }
    ],
    "connections": [
      {
        "from": [
          {
            "element": "source_file"
          }
        ],
        "to": {
          "success": [
            {
              "element": "analyze_task"
            }
          ]
        }
      },
      {
        "from": [
          {
            "element": "analyze_task"
          }
        ],
        "to": {
          "success": [
            {
              "element": "qc_task"
            }
          ]
        }
      }
    ]
  }
}

In Hybrik, the QC Task must be paired with a preceding Analyze Task. The Analyze Task analyzes the selected file and creates a collection of metadata about the file. The QC Task then sets pass, fail, and warn conditions based on that metadata. The conditions can use any parameter value generated by the Analyze Task, and each condition can have a unique pass/fail message. The pass, fail, and warn settings are arrays of conditions. A pass condition will pass if the condition is true. A fail condition will generate a failure if the condition is true.
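
The example job above defines only pass and warn conditions; a fail array follows the same structure. The sketch below is illustrative: the threshold and messages are placeholders, and the property name is assumed to follow the same source.video.* pattern as the width and bitrate_kb values used in the example above. Note that, per the field descriptions below, message_pass is reported when the condition evaluates to true, which in a fail array means the failure was triggered.

{
  "uid": "qc_task",
  "kind": "qc",
  "payload": {
    "conditions": {
      "fail": [
        {
          "condition": "(source.video.height < 480)",
          "message_pass": "FAIL: video height is below 480; actual value = {source.video.height}",
          "message_fail": "Video height is at least 480; actual value = {source.video.height}"
        }
      ]
    }
  }
}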

qc

Name Type Description
condition string This string is evaluated to true or false. Standard mathematical and logical operators can be part of the condition. Additionally, functions from the math.js JavaScript library may be used.
message_pass string The string to be reported when the condition evaluates as true. Note that when the condition is being used in a "fail" array, a "message_pass" actually means that the fail condition has been met.
message_fail string The string to be reported when the condition evaluates as false. Note that when the condition is being used in a "fail" array, a "message_fail" actually means that the fail condition has not been met.

Notify Task

Notify

Example Notify Object

{
  "name": "Hybrik Transcode Example",
  "payload": {
    "elements": [
      {
        "uid": "source_file",
        "kind": "source",
        "payload": {
          "kind": "asset_url",
          "payload": {
            "storage_provider": "s3",
            "url": "s3://my_bucket/my_input_folder/my_file.mp4"
          }
        }
      },
      {
        "uid": "transcode_task",
        "kind": "transcode",
        "payload": {
          "location": {
            "storage_provider": "s3",
            "path": "s3://my_bucket/my_output_folder"
          },
          "targets": [
            {
              "file_pattern": "{source_basename}_converted.mp4",
              "existing_files": "replace",
              "container": {
                "kind": "mp4"
              },
              "video": {
                "width": 1280,
                "height": 720,
                "codec": "h264",
                "profile": "high",
                "level": "4.0",
                "frame_rate": 23.976
              },
              "audio": [
                {
                  "codec": "aac",
                  "channels": 2,
                  "sample_rate": 48000,
                  "sample_size": 16,
                  "bitrate_kb": 128
                }
              ]
            }
          ]
        }
      },
      {
        "uid": "notify_task",
        "kind": "notify",
        "payload": {
          "notify_method": "email",
          "email": {
            "recipients": "{account_owner_email}",
            "subject": "Job {job_id} has completed.",
            "body": "File {source_basename} was processed."
          }
        }
      }
    ],
    "connections": [
      {
        "from": [
          {
            "element": "source_file"
          }
        ],
        "to": {
          "success": [
            {
              "element": "transcode_task"
            }
          ]
        }
      },
      {
        "from": [
          {
            "element": "transcode_task"
          }
        ],
        "to": {
          "success": [
            {
              "element": "notify_task"
            }
          ]
        }
      }
    ]
  }
}

In a Hybrik Job, you may want to trigger some external event or notify an individual when something particular happens. For example, you may want a failed job to send an email to someone in your QA department. This can be accomplished by adding a Notification Task to your workflow. Note that you can have multiple Notification Tasks within your workflow, triggering at different points. The supported notification types include REST calls, emails, and Amazon AWS Simple Notification Service (SNS) and/or Simple Queue Service (SQS) messaging. In addition to Notification Tasks, Hybrik also supports message Subscriptions. A Notification sends a one-time message at a specific point in the Workflow, while a Subscription can provide ongoing status reports (such as progress) on a Job. Please see the Hybrik Tutorial on Subscriptions for more information.

notify

Name Type Description
notify_method enum                          
email
rest
sns
sqs
Notification method for messaging.
default: rest
email object Email notification settings.
rest object REST notification settings.
sns object Amazon SNS notification settings.
sqs object Amazon SQS notification settings.

Email

Example Email Object

{
  "uid": "notify_task",
  "kind": "notify",
  "payload": {
    "notify_method": "email",
    "email": {
      "recipients": "{account_owner_email}",
      "subject": "Job {job_id} has completed.",
      "body": "File {source_basename} was processed."
    }
  }
}

notify.email

Name Type Description
recipients string
Email recipients, comma separated. Use {account_owner_email} to send an email to the Hybrik account owner.
subject string
Email subject line. Placeholders such as {source_name} and {job_id} are supported.
body string
Email body text. Placeholders such as {source_name} and {job_id} are supported.

Rest

Example Rest Object

{
  "uid": "notify_task",
  "kind": "notify",
  "payload": {
    "notify_method": "rest",
    "rest": {
      "url": "http://my_rest_entry1.example.com/1057729/complete",
      "method": "PUT"
    }
  }
}

notify.rest

Name Type Description
url string
The URL for the REST call. Should be in complete format such as "http://my_rest_entry.example.com"
default: http://
proxy object Optional REST proxy URL, if a proxy server is used.
method enum                          
GET
PUT
POST
DELETE
HTTP method to use.

Proxy

notify.rest.proxy

Example Proxy Object

{
  "uid": "notify_task",
  "kind": "notify",
  "payload": {
    "notify_method": "rest",
    "rest": {
      "url": "http://my_rest_entry1.example.com/1057729/complete",
      "method": "PUT",
      "proxy": {
        "url": "http://www.proxy.example.com"
      }
    }
  }
}
Name Type Description
url string The proxy server URL.

Sns

Example Sns Object

{
  "uid": "notify_task",
  "kind": "notify",
  "payload": {
    "notify_method": "sns",
    "sns": {
      "topic": "arn:aws:sns:us-east-1:34534534353453:my-sns",
      "user_payload": "my SNS notification: {job_id}"
    }
  }
}

notify.sns

Name Type Description
topic string Amazon Web Services SNS topic.
aws_access oneOf         
basic_aws_credentials
computing_group_ref
credentials_vault_ref
Amazon Web Services credentials to use for SNS messaging.
user_payload string
Payload to attach to the SNS message. Placeholders may be used like {source_name} and {job_id}.

Basic_aws_credentials

notify.sns.basic_aws_credentials

Example Basic_aws_credentials Object

{
  "uid": "notify_task",
  "kind": "notify",
  "payload": {
    "notify_method": "sns",
    "sns": {
      "topic": "arn:aws:sns:us-east-1:34534534353453:my-sns",
      "user_payload": "my SNS notification: {job_id}",
      "aws_access": {
        "shared_key": "XXXXXXXXXXXXXXXXXXX",
        "secret_key": "12345678901234567890"
      }
    }
  }
}
Name Type Description
shared_key string
The AWS Key.
secret_key string
The AWS Secret.
session_token string
The AWS Session Token.
region string
The AWS region (optional).
max_cross_region_mb integer This sets the maximum amount of data (in MB) that can be transferred across regions. This is to avoid excessive inter-region transfer costs. Set this to -1 for unlimited transfers.

Computing_group_ref

notify.sns.computing_group_ref

Example Computing_group_ref Object

{
  "uid": "notify_task",
  "kind": "notify",
  "payload": {
    "notify_method": "sns",
    "sns": {
      "topic": "arn:aws:sns:us-east-1:34534534353453:my-sns",
      "user_payload": "my SNS notification: {job_id}",
      "aws_access": {
        "computing_group_id": "495292",

      }
    }
  }
}
Name Type Description
computing_group_id string
Use the AWS credentials from the specified Computing Group ID
max_cross_region_mb integer This sets the maximum amount of data (in MB) that can be transferred across regions. This is to avoid excessive inter-region transfer costs. Set this to -1 for unlimited transfers.

Credentials_vault_ref

notify.sns.credentials_vault_ref

Example Credentials_vault_ref Object

{
  "uid": "notify_task",
  "kind": "notify",
  "payload": {
    "notify_method": "sns",
    "sns": {
      "topic": "arn:aws:sns:us-east-1:34534534353453:my-sns",
      "user_payload": "my SNS notification: {job_id}",
      "aws_access": {
        "credentials_key": "my_AWS_vault_key",

      }
    }
  }
}
Name Type Description
credentials_key string
Use API Key to reference credentials inside the Hybrik Credentials Vault.
region string
AWS region, optional
max_cross_region_mb integer This sets the maximum amount of data (in MB) that can be transferred across regions. This is to avoid excessive inter-region transfer costs. Set this to -1 for unlimited transfers.

Sqs

Example Sqs Object

{
  "uid": "notify_task",
  "kind": "notify",
  "payload": {
    "notify_method": "sqs",
    "sqs": {
      "name": "my_sqs_queue",
      "user_payload": "Job {job_id} has completed.",
      "aws_access": {
        "credentials_key": "my_aws_vault_key"
      }
    }
  }
}

notify.sqs

Name Type Description
name string Amazon Web Services SQS Queue name.
group_id string Amazon Web Services SQS MessageGroupId.
provide_deduplication boolean Send a deduplication ID.
aws_access oneOf         
basic_aws_credentials
computing_group_ref
credentials_vault_ref
Amazon Web Services credentials to use for SQS messaging.

Basic_aws_credentials

notify.sqs.basic_aws_credentials

Example Basic_aws_credentials Object

{
  "uid": "notify_task",
  "kind": "notify",
  "payload": {
    "notify_method": "sqs",
    "sqs": {
      "name": "my_sqs_queue",
      "user_payload": "Job {job_id} has completed.",
      "aws_access": {
        "shared_key": "AWERDFSDFSAEFAWERF",
        "secret_key": "2345rsdfswrqw4rw4redfljq34r23rf1223as"
      }
    }
  }
}
Name Type Description
shared_key string
The AWS Key.
secret_key string
The AWS Secret.
session_token string
The AWS Session Token.
region string
The AWS region (optional).
max_cross_region_mb integer This sets the maximum amount of data (in MB) that can be transferred across regions. This is to avoid excessive inter-region transfer costs. Set this to -1 for unlimited transfers.

Computing_group_ref

notify.sqs.computing_group_ref

Example Computing_group_ref Object

{
  "uid": "notify_task",
  "kind": "notify",
  "payload": {
    "notify_method": "sqs",
    "sqs": {
      "name": "my_sqs_queue",
      "user_payload": "Job {job_id} has completed.",
      "aws_access": {
        "computing_group_id": "My Main Group"
      }
    }
  }
}
Name Type Description
computing_group_id string
Use the AWS credentials from the specified Computing Group ID
max_cross_region_mb integer This sets the maximum amount of data (in MB) that can be transferred across regions. This is to avoid excessive inter-region transfer costs. Set this to -1 for unlimited transfers.

Credentials_vault_ref

notify.sqs.credentials_vault_ref

Example Credentials_vault_ref Object

{
  "uid": "notify_task",
  "kind": "notify",
  "payload": {
    "notify_method": "sqs",
    "sqs": {
      "name": "my_sqs_queue",
      "user_payload": "Job {job_id} has completed.",
      "aws_access": {
        "credentials_key": "my_aws_vault_key"
      }
    }
  }
}
Name Type Description
credentials_key string
Use API Key to reference credentials inside the Hybrik Credentials Vault.
region string
AWS region, optional
max_cross_region_mb integer This sets the maximum amount of data (in MB) that can be transferred across regions. This is to avoid excessive inter-region transfer costs. Set this to -1 for unlimited transfers.

Copy Task

Copy

Example Copy Object

{
  "name": "Hybrik Copy Example",
  "payload": {
    "elements": [
      {
        "uid": "source_file",
        "kind": "source",
        "payload": {
          "kind": "asset_url",
          "payload": {
            "storage_provider": "s3",
            "url": "s3://my_source_bucket/my_source_folder/my_file.mp4"
          }
        }
      },
      {
        "uid": "copy_task",
        "kind": "copy",
        "payload": {
          "target": {
            "location": {
              "storage_provider": "s3",
              "path": "s3://my_destination_bucket/my_destination_folder"
            },
            "existing_files": "replace"
          }
        }
      }
    ],
    "connections": [
      {
        "from": [
          {
            "element": "source_file"
          }
        ],
        "to": {
          "success": [
            {
              "element": "copy_task"
            }
          ]
        }
      }
    ]
  }
}

A common requirement in a Workflow is to move data from one place to another. After a transcode, for example, you may want to send the result to a client, or you may want to move a file from one AWS S3 bucket to another. These can all be accomplished with a Copy Task. Different types of transfers are supported, including S3, HTTP, FTP, SFTP, etc. When specifying a Copy, you can specify what to do if a file with that name already exists in the new location. You can also specify what to do with the original file once it has been successfully copied to the new location, as in the sketch below.
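
For example, a copy element that replaces any existing file at the destination and removes the source file after a successful copy might look like the following sketch; the bucket names are placeholders, and the target and options objects are described in the sections below.

{
  "uid": "copy_task",
  "kind": "copy",
  "payload": {
    "target": {
      "location": {
        "storage_provider": "s3",
        "path": "s3://my_destination_bucket/my_destination_folder"
      },
      "existing_files": "replace"
    },
    "options": {
      "delete_sources": true
    }
  }
}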

copy

Name Type Description
target object Information about the target, including location, file naming, and method for handling existing files.
options object Options for the copy operation, including control over source deletion and error handling.

Target

Example Target Object

{
  "uid": "copy_task",
  "kind": "copy",
  "payload": {
    "target": {
      "location": {
        "storage_provider": "s3",
        "path": "s3://my_destination_bucket/my_destination_folder"
      },
      "existing_files": "delete_and_replace"
    }
  }
}

copy.target

Name Type Description
location object All sources will be copied to this location.
file_pattern string
This describes the target file name. Placeholders such as {source_basename} for source file name are supported.
default: {source_basename}
existing_files enum                          
delete_and_replace
replace
replace_late
rename_new
rename_org
fail
The desired behavior when a target file already exists. "replace": will delete the original file and write the new one. "rename_new": gives the new file a different auto-generated name. "rename_org": renames the original file. Note that renaming the original file may not be possible depending on the target location. "delete_and_replace": attempts to immediately delete the original file. This will allow for fast failure in the case of inadequate permissions. "replace_late": does not attempt to delete the original -- simply executes a write.
default: fail

Options

Example Options Object

{
  "uid": "copy_task",
  "kind": "copy",
  "payload": {
    "target": {
      "location": {
        "storage_provider": "s3",
        "path": "s3://my_destination_bucket/my_destination_folder"
      },
      "existing_files": "delete_and_replace"
    },
    "options": {
      "restore_from_glacier": true
    }
  }
}

copy.options

Name Type Description
delete_sources boolean
Delete the task's source files upon successful completion of the task.
delete_source_on_completion boolean Will delete source on successful copy.
throttle_byte_per_sec integer
string
Throttle the copy operation. Useful, for example, for operations with SwiftStack storage on shared network lines.
restore_from_glacier boolean If a file is on s3 with the storage class 'GLACIER', issue a restore and wait for arrival before copying.
multi_file_concurrency integer If multiple files are to be copied, limit the concurrency of the copy operation.
multi_source_mode enum                          
use_first
use_all
concatenate
When multiple tasks feed into a single copy operation, the copy operation can be told to (1) just use the output from the first task for the copy and the others to control timing, (2) use all of the outputs from the input tasks, or (3) to concatenate the outputs from the input tasks into a single output.

Package Task

Package

Example Package Object


  "name": "Hybrik Package Example",
  "payload": {
    "elements": [
      {
        "uid": "source_file",
        "kind": "source",
        "payload": {
          "kind": "asset_url",
          "payload": {
            "storage_provider": "s3",
            "url": "s3://my_bucket/my_folder/my_file.mp4"
          }
        }
      },
      {
        "uid": "transcode_all_renditions",
        "kind": "transcode",
        "payload": {
          "location": {
            "storage_provider": "s3",
            "path": "s3://my_bucket/my_output_folder/mp4s"
          },
          "targets": [
            {
              "file_pattern": "{source_basename}_800kbps{default_extension}",
              "existing_files": "replace",
              "container": {
                "kind": "fmp4",
                "segment_duration_sec": 6
              },
              "video": {
                "codec": "h264",
                "bitrate_mode": "cbr",
                "use_scene_detection": false,
                "bitrate_kb": 800,
                "height": 486
              }
            },
            {
              "file_pattern": "{source_basename}_400kbps{default_extension}",
              "existing_files": "replace",
              "container": {
                "kind": "fmp4",
                "segment_duration_sec": 6
              },
              "video": {
                "codec": "h264",
                "bitrate_mode": "cbr",
                "use_scene_detection": false,
                "bitrate_kb": 400,
                "height": 360
              }
            },
            {
              "file_pattern": "{source_basename}_200kbps{default_extension}",
              "existing_files": "replace",
              "container": {
                "kind": "fmp4",
                "segment_duration_sec": 6
              },
              "video": {
                "codec": "h264",
                "bitrate_mode": "cbr",
                "use_scene_detection": false,
                "bitrate_kb": 200,
                "height": 252
              }
            },
            {
              "file_pattern": "{source_basename}_audio_64kbps{default_extension}",
              "existing_files": "replace",
              "container": {
                "kind": "fmp4",
                "segment_duration_sec": 6
              },
              "audio": [
                {
                  "channels": 2,
                  "codec": "aac_lc",
                  "sample_rate": 48000,
                  "bitrate_kb": 96
                }
              ]
            }
          ]
        }
      },
      {
        "uid": "package_hls",
        "kind": "package",
        "payload": {
          "kind": "hls",
          "location": {
            "storage_provider": "s3",
            "path": "s3://my_bucket/my_output_folder/hls_manifests",
            "attributes": [
              {
                "name": "ContentType",
                "value": "application/x-mpegURL"
              }
            ]
          },
          "file_pattern": "master_manifest.m3u8",
          "segmentation_mode": "segmented_ts",
          "segment_duration_sec": "{{segment_duration}}",
          "force_original_media": false,
          "media_location": {
            "storage_provider": "s3",
            "path": "s3://my_bucket/my_output_folder/hls_media",
            "attributes": [
              {
                "name": "ContentType",
                "value": "video/MP2T"
              }
            ]
          },
          "media_file_pattern": "{source_basename}.ts",
          "hls": {
            "media_playlist_location": {
              "storage_provider": "s3",
              "path": "s3://my_bucket/my_output_folder/hls_manifests"
            }
          }
        }
      }
    ],
    "connections": [
      {
        "from": [
          {
            "element": "source_file"
          }
        ],
        "to": {
          "success": [
            {
              "element": "transcode_all_renditions"
            }
          ]
        }
      },
      {
        "from": [
          {
            "element": "transcode_all_renditions"
          }
        ],
        "to": {
          "success": [
            {
              "element": "package_hls"
            }
          ]
        }
      }
    ]
  }
}

In addition to creating many different types of output files, Hybrik can also package these files into Adaptive Bitrate (ABR) formats like HTTP Live Streaming (HLS) and Dynamic Adaptive Streaming over HTTP (DASH). The Package Task takes the results from a Transcode Task and re-multiplexes the output into the required HLS or DASH formats, including creating the various required manifest files. These manifest files tell the player how to switch between the various bitrates in order to create smooth playback. The Package Task supports multiple audio tracks as well as subtitles in both HLS and DASH. You can also encrypt your ABR files in Hybrik.

package

Name Type Description
options object Packaging options including source deletion and validation steps.
location object This will override any location defined within the parent of this manifest.
file_pattern string
This describes the target file name. Placeholders such as {source_basename} for source file name are supported.
default: {source_basename}
force_original_media boolean Use the original transcoded files rather than remuxing them in the Package task to create the HLS/DASH outputs. Requires that the original files have correct segmentation.
kind enum                          
hls
dash
smooth
The kind of package to create. Options are: "dash", "hls", or "smooth".
dash object MPEG-DASH specific settings, only valid if kind is "dash".
smooth object Smooth Streaming specific settings, only valid if kind is "smooth".
hls object HLS specific settings, only valid if kind is "hls".
encryption object DRM and encryption settings for the produced media files.
segmentation_mode enum                          
segmented_ts
single_ts
multiplexed_ts
fmp4
segmented_mp4
Type of segmentation for media files.
segment_duration_sec number Desired duration of segments. Only valid for segmented_ts and segmented_mp4 modes.
media_url_prefix string The URL prefix to be added to media locations.
media_file_pattern string
This describes the target file name. Placeholders such as {source_basename} for source file name are supported.
default: {source_basename}
media_file_extensions object The extensions to be used for each type of media in the output.
init_file_pattern string
The file pattern for the segmented mp4 init file.
title string An optional title. Note that not all multiplexers support adding a title.
author string An optional author. Note that not all multiplexers support adding an author.
copyright string An optional copyright string. Note that not all multiplexers support adding a copyright string.
info_url string An optional info URL string. Note that not all multiplexers support adding a url.
uid string This describes the manifest UID. Encodes to be included in this manifest creation must include this UID in their manifest_uids property. If no UID is specified here, then all encodes are included.
closed_captions array An array of closed-caption references to be included in the manifest.
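
The uid property ties a Package element to specific transcode outputs. Below is a hedged sketch of that linkage, assuming (per the uid description above) that each transcode target lists the relevant manifest UID in a manifest_uids array; the element names, paths, and bitrate are placeholders.

{
  "elements": [
    {
      "uid": "transcode_abr",
      "kind": "transcode",
      "payload": {
        "location": {
          "storage_provider": "s3",
          "path": "s3://my_bucket/my_output_folder"
        },
        "targets": [
          {
            "file_pattern": "{source_basename}_800kbps{default_extension}",
            "existing_files": "replace",
            "manifest_uids": ["hls_master_a"],
            "container": {
              "kind": "fmp4",
              "segment_duration_sec": 6
            },
            "video": {
              "codec": "h264",
              "bitrate_kb": 800,
              "height": 486
            }
          }
        ]
      }
    },
    {
      "uid": "package_hls_a",
      "kind": "package",
      "payload": {
        "kind": "hls",
        "uid": "hls_master_a",
        "location": {
          "storage_provider": "s3",
          "path": "s3://my_bucket/my_output_folder/hls_manifests"
        },
        "file_pattern": "master_manifest.m3u8",
        "segmentation_mode": "segmented_ts",
        "segment_duration_sec": 6
      }
    }
  ]
}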

Options

Example Options Object

{
  "uid": "package_hls",
  "kind": "package",
  "payload": {
    "kind": "hls",
    "options": {
      "delete_sources": true,
      "skip_validation": true
    },
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_output_folder/hls_manifests",
      "attributes": [
        {
          "name": "ContentType",
          "value": "application/x-mpegURL"
        }
      ]
    },
    "file_pattern": "master_manifest.m3u8",
    "segmentation_mode": "segmented_ts",
    "segment_duration_sec": "{{segment_duration}}",
    "force_original_media": false,
    "media_location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_output_folder/hls_media",
      "attributes": [
        {
          "name": "ContentType",
          "value": "video/MP2T"
        }
      ]
    },
    "media_file_pattern": "{source_basename}.ts",
    "hls": {
      "media_playlist_location": {
        "storage_provider": "s3",
        "path": "s3://my_bucket/my_output_folder/hls_manifests"
      }
    }
  }
}

package.options

Name Type Description
delete_sources boolean
Delete the packaging source files (not the master sources) upon successful completion of the task.
skip_validation boolean
Skip the validation step for the Package task.

Dash

Example Dash Object

{
  "uid": "dash_unencrypted_packaging",
  "kind": "package",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder",
      "attributes": [
        {
          "name": "ContentType",
          "value": "application/dash+xml"
        }
      ]
    },
    "file_pattern": "manifest.mpd",
    "kind": "dash",
    "uid": "main_manifest",
    "force_original_media": false,
    "media_location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder",
      "attributes": [
        {
          "name": "ContentType",
          "value": "video/mp4"
        }
      ]
    },
    "dash": {
      "title": "some dash title",
      "info_url": "www.dolby.com"
    }
  }
}

package.dash

Name Type Description
location object This will override any location defined within the parent of this manifest.
file_pattern string
This describes the target file name. Placeholders such as {source_basename} for source file name are supported.
default: {source_basename}
compliance enum                          
generic
hmmp
senvu_2012
hmmpabs_2016
hmmpabs_2016_sdr
hmmpabs_2017
Specifies the MPEG-DASH compliance settings to use.
compliances array An array specifying the MPEG-DASH compliance settings to use.
base_url string MPEG-DASH MPD BaseURL.
title string MPEG-DASH MPD title.
copyright string MPEG-DASH MPD copyright.
info_url string MPEG-DASH MPD info URL.
remove_partial_mpd boolean Unlike HLS, MPEG-DASH does not reference per-layer manifest files. Enable this option to remove these potentially obsolete files.
profile_urns array MPEG-DASH profile URNs.
subtitle_container enum                          
webvtt
fmp4
Specifies whether to use WebVTT or FMP4 as the container for subtitles.
manifest_location object The location of the MPD files.
manifest_file_pattern string
The file pattern of the MPD files.
default: {source_basename}
adaptation_sets array An array defining the Adaptation Sets included in the DASH manifest. Commonly, you will have one video Adaptation Set and multiple audio Adaptation Sets.
use_segment_list boolean Uses the segment list instead of segment templates.
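
Here is a hedged sketch of a dash settings block built from the fields in the table above; the base URL is an illustrative placeholder.

{
  "uid": "dash_packaging",
  "kind": "package",
  "payload": {
    "kind": "dash",
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "file_pattern": "manifest.mpd",
    "dash": {
      "compliance": "generic",
      "base_url": "https://cdn.example.com/vod/",
      "subtitle_container": "webvtt",
      "remove_partial_mpd": true,
      "use_segment_list": false
    }
  }
}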

Adaptation_sets

package.dash.adaptation_sets

Example Adaptation_sets Object

{
  "uid": "dash_unencrypted_packaging",
  "kind": "package",
  "payload": {
    "kind": "dash",
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder",
      "attributes": [
        {
          "name": "ContentType",
          "value": "application/dash+xml"
        }
      ]
    },
    "file_pattern": "manifest.mpd",
    "uid": "main_manifest",
    "force_original_media": false,
    "media_location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder",
      "attributes": [
        {
          "name": "ContentType",
          "value": "video/mp4"
        }
      ]
    },
    "dash": {
      "title": "some dash title",
      "info_url": "www.dolby.com"
    },
    "adaptation_sets": [
      {
        "track_group_id": "1",
        "id": 1,
        "role": "main"
      }
    ]
  }
}
Name Type Description
track_group_id string This indicates which Group this track belongs to. Multiple tracks with the same content but different bitrates would have the same track_group_id.
id integer The ID for the Adaptation Set.
role string The Role for the Adaptation Set.
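
As a hedged sketch, the snippet below defines one video Adaptation Set and one audio Adaptation Set, following the placement of adaptation_sets used in the example above; the role and group ID values are illustrative.

{
  "uid": "dash_packaging",
  "kind": "package",
  "payload": {
    "kind": "dash",
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder"
    },
    "file_pattern": "manifest.mpd",
    "adaptation_sets": [
      {
        "track_group_id": "1",
        "id": 1,
        "role": "main"
      },
      {
        "track_group_id": "2",
        "id": 2,
        "role": "alternate"
      }
    ]
  }
}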

Smooth

Example Smooth Object

{
  "uid": "smooth_encrypted_packaging",
  "kind": "package",
  "payload": {
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder",
      "attributes": [
        {
          "name": "ContentType",
          "value": "application/vnd.ms-sstr+xml"
        }
      ]
    },
    "file_pattern": "manifest.ism",
    "kind": "smooth",
    "uid": "main_manifest",
    "force_original_media": false,
    "encryption": {
      "enabled": true,
      "schema": "mpeg-cenc",
      "drm": [
        "playready"
      ],
      "key_id": "XXXXXX51686b5e1ba222439ecec1f12a",
      "key": "0000002cbf1a827e2fecfb87479a2",
      "playready_url": "http://playready.directtaps.net/pr/svc/rightsmanager.asmx"
    },
    "media_location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_folder",
      "attributes": [
        {
          "name": "ContentType",
          "value": "video/mp4"
        }
      ]
    }
  }
}

package.smooth

Name Type Description
location object This will override any location defined within the parent of this manifest.
file_pattern string
This describes the target file name. Placeholders such as {source_basename} for source file name are supported.
default: {source_basename}
title string The Smooth Streaming manifest title.
copyright string The Smooth Streaming manifest copyright.
info_url string The Smooth Streaming manifest info URL.
profile_urns array Profile URNs to be listed in the manifest.
manifest_location object The location of the client manifest (.ismc) files.
manifest_file_pattern string
The file pattern of the client manifest (.ismc) files.
default: {source_basename}
server_manifest_location object The location of the server manifest (.ism) files.
server_manifest_file_pattern string
The file pattern of the server manifest (.ism) files.
default: {source_basename}

Hls

Example Hls Object

{
  "uid": "package_hls",
  "kind": "package",
  "payload": {
    "kind": "hls",
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_output_folder/hls_manifests",
      "attributes": [
        {
          "name": "ContentType",
          "value": "application/x-mpegURL"
        }
      ]
    },
    "file_pattern": "master_manifest.m3u8",
    "segmentation_mode": "segmented_ts",
    "segment_duration_sec": "{{segment_duration}}",
    "force_original_media": false,
    "media_location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_output_folder/hls_media",
      "attributes": [
        {
          "name": "ContentType",
          "value": "video/MP2T"
        }
      ]
    },
    "media_file_pattern": "{source_basename}.ts",
    "hls": {
      "media_playlist_location": {
        "storage_provider": "s3",
        "path": "s3://my_bucket/my_output_folder/hls_manifests"
      },
      "include_iframe_manifests": true,
      "primary_layer_uid": "Layer4"
    }
  }
}

package.hls

Name Type Description
location object This will override any location defined within the parent of this manifest.
file_pattern string
This describes the target file name. Placeholders such as {source_basename} for source file name are supported.
default: {source_basename}
version enum                          
3
4
5
6
7
8
9
10
The HLS version for packaging.
ietf_draft_version integer Attributes and tags newer than the version listed here may be omitted.
primary_layer_uid string
If one of the included targets has a matching UID, it will be listed as the first layer in the HLS master manifest.
include_iframe_manifests boolean If the individual layers have i-frame/trick play manifests, include these in the master manifest. This requires HLS version 4 or greater.
hevc_codec_id_prefix string Allows overriding the HEVC codec identifier.
media_playlist_location object The location of the media playlist .m3u8 files.
media_playlist_url_prefix string The URL prefix to be added to media playlist locations.
media_playlist_file_pattern string
The file pattern of the media playlist .m3u8 files.
default: {source_basename}
manifest_location object The location of the master .m3u8 files.
manifest_file_pattern string
The file pattern of the master .m3u8 files. Example: {source_basename}_master_manifest.m3u8
default: {source_basename}
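
Here is a hedged sketch of an hls settings block that combines version 4 (required for i-frame manifests) with a URL prefix for the media playlists; the prefix and file pattern are illustrative.

{
  "uid": "package_hls",
  "kind": "package",
  "payload": {
    "kind": "hls",
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_output_folder/hls_manifests"
    },
    "file_pattern": "master_manifest.m3u8",
    "segmentation_mode": "segmented_ts",
    "segment_duration_sec": 6,
    "hls": {
      "version": 4,
      "include_iframe_manifests": true,
      "media_playlist_url_prefix": "https://cdn.example.com/hls/",
      "media_playlist_file_pattern": "{source_basename}_media.m3u8"
    }
  }
}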

Encryption

Example Encryption Object

{
  "uid": "package_hls",
  "kind": "package",
  "payload": {
    "kind": "hls",
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_output_folder/hls_manifests",
      "attributes": [
        {
          "name": "ContentType",
          "value": "application/x-mpegURL"
        }
      ]
    },
    "file_pattern": "master_manifest.m3u8",
    "segmentation_mode": "segmented_ts",
    "segment_duration_sec": "{{segment_duration}}",
    "force_original_media": false,
    "media_location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_output_folder/hls_media",
      "attributes": [
        {
          "name": "ContentType",
          "value": "video/MP2T"
        }
      ]
    },
    "media_file_pattern": "{source_basename}.ts",
    "hls": {
      "media_playlist_location": {
        "storage_provider": "s3",
        "path": "s3://my_bucket/my_output_folder/hls_manifests"
      }
    },
    "encryption": {
      "enabled": true,
      "schema": "mpeg-cenc",
      "drm": [
        "playready"
      ],
      "key_id": "[32 char hex sequence]",
      "key": "[32 char hex sequence]",
      "content_id": "2a",
      "playready_pssh": "[base-64 encoded pssh]...=="
    }
  }
}

package.encryption

Name Type Description
enabled boolean Enable or disable encryption.
schema enum                          
aes-128-cbc
sample-aes
mpeg-cenc
mpeg-cbc1
mpeg-cens
mpeg-cbcs
none
The chosen encryption schema. Encryption keys will be generated by Hybrik.
default: aes-128-cbc
drm array An array specifying the types of DRM that will be used.
rotation integer
The encryption rotation interval. Every N file segments, a new encryption key will be generated.
default: 12
key_location object The optional key location. This will override any location defined within the parent of this task.
key_file_pattern string
This describes the key file name. Placeholders such as {source_basename} for source file name are supported.
key string The actual key, if pre-supplied.
iv string The initialization vector, if pre-supplied.
key_id string The Key ID. Used for MPEG-CENC only.
content_id string The Content ID. Used for MPEG-CENC only.
widevine_provider string The Widevine provider.
widevine_pssh string A Widevine PSSH string.
playready_url string The PlayReady licensing authority URL.
playready_pssh string A PlayReady PSSH string.
fairplay_uri string The FairPlay URI for the HLS URI attribute.
clearkey_pssh_version integer The PSSH box version for CENC.
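
In contrast to the MPEG-CENC/PlayReady example above, here is a hedged sketch of simple AES-128 HLS encryption, where Hybrik generates the keys and writes them to a key location; the paths and key file pattern are illustrative.

{
  "uid": "package_hls",
  "kind": "package",
  "payload": {
    "kind": "hls",
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_output_folder/hls_manifests"
    },
    "file_pattern": "master_manifest.m3u8",
    "segmentation_mode": "segmented_ts",
    "segment_duration_sec": 6,
    "media_location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_output_folder/hls_media"
    },
    "media_file_pattern": "{source_basename}.ts",
    "encryption": {
      "enabled": true,
      "schema": "aes-128-cbc",
      "rotation": 12,
      "key_location": {
        "storage_provider": "s3",
        "path": "s3://my_bucket/my_output_folder/keys"
      },
      "key_file_pattern": "{source_basename}.key"
    }
  }
}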

Media_file_extensions

Example Media_file_extensions Object

{
  "uid": "package_hls",
  "kind": "package",
  "payload": {
    "kind": "hls",
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_output_folder/hls_manifests"
    },
    "file_pattern": "master_manifest.m3u8",
    "segmentation_mode": "segmented_ts",
    "segment_duration_sec": "6",
    "force_original_media": false,
    "media_location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_output_folder/hls_media"
    },
    "media_file_pattern": "{source_basename}.ts",
    "hls": {
      "media_playlist_location": {
        "storage_provider": "s3",
        "path": "s3://my_bucket/my_output_folder/hls_manifests"
      }
    },
    "media_file_extensions": {
      "audio": ".ts",
      "video": ".ts",
      "subtitle": ".vtt"
    }
  }
}

package.media_file_extensions

Name Type Description
audio string The extension to be used for audio files.
video string The extension to be used for video files.
subtitle string The extension to be used for subtitle files.

Closed_captions

Example Closed_captions Object

{
  "uid": "package_hls",
  "kind": "package",
  "payload": {
    "kind": "hls",
    "location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_output_folder/hls_manifests"
    },
    "file_pattern": "master_manifest.m3u8",
    "segmentation_mode": "segmented_ts",
    "segment_duration_sec": "6",
    "force_original_media": false,
    "media_location": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_output_folder/hls_media"
    },
    "media_file_pattern": "{source_basename}.ts",
    "hls": {
      "media_playlist_location": {
        "storage_provider": "s3",
        "path": "s3://my_bucket/my_output_folder/hls_manifests"
      }
    },
    "closed_captions": [
      {
        "cc_index": "1",
        "language": "es",
        "is_defualt": true,
        "is_autoselect": true,
        "group_id": "Spanish"
      }
    ]
  }
}

package.closed_captions

Name Type Description
cc_index integer The index value for the closed-caption track.
minimum: 1
maximum: 4
language string The language of the closed-caption track.
is_default boolean Setting to specify that this track is the default closed-caption track.
is_autoselect boolean Setting to automatically auto-select this track.
group_id string The GroupID of the track.

Folder Enum Task

Folder_enum

Example Folder_enum Object

{
  "name": "Hybrik Folder Enumeration Example",
  "payload": {
    "elements": [
      {
        "uid": "folder_enum_task",
        "kind": "folder_enum",
        "payload": {
          "source": {
            "storage_provider": "s3",
            "path": "s3://my_source_bucket/my_source_folder"
          },
          "settings": {
            "pattern_matching": "wildcard",
            "wildcard": "*",
            "recursive": true
          }
        }
      },
      {
        "uid": "copy_task",
        "kind": "copy",
        "payload": {
          "target": {
            "location": {
              "storage_provider": "s3",
              "path": "s3://my_destination_bucket/my_destination_folder"
            },
            "existing_files": "replace"
          }
        }
      }
    ],
    "connections": [
      {
        "from": [
          {
            "element": "folder_enum_task"
          }
        ],
        "to": {
          "success": [
            {
              "element": "copy_task"
            }
          ]
        }
      }
    ]
  }
}

Hybrik allows you to automate media workflows via the API or via Watchfolders. Sometimes, however, all you want to do is process an entire folder of content. For this, Hybrik provides the Folder Enumeration Task (Folder_enum). A Folder_enum job looks like a regular Hybrik job, but instead of a Source Element naming a specific file, it has a Folder_enum Element pointing to a folder location. Submitting this job actually creates multiple jobs, one for each source file in the specified folder location. You can control which types of files get selected, as well as whether to recurse through sub-folders. A single Folder_enum submission can therefore trigger hundreds (or even thousands) of jobs.

In addition to triggering transcode jobs, the Folder_enum task is handy for moving large amounts of data quickly between S3 locations. If you use Folder_enum as the source element for a copy job, it will generate a copy job for each file. These jobs are allocated across the available machines, which spreads the transfer load and, depending on the number of machines assigned, can move data hundreds of times faster than a standard single-machine S3 data move.

folder_enum

Name Type Description
source object The source location.
settings object The recursion and pattern matching settings for the folder enumeration.

Settings

Example Settings Object

{
  "uid": "folder_enum_task",
  "kind": "folder_enum",
  "payload": {
    "source": {
      "storage_provider": "s3",
      "path": "s3://my_source_bucket/my_source_folder"
    },
    "settings": {
      "pattern_matching": "wildcard",
      "wildcard": "*.mov",
      "recursive": false
    }
  }
}

folder_enum.settings

Name Type Description
recursive boolean
Determines whether sub-folders should be scanned recursively for content.
pattern_matching enum                          
wildcard
regex
The type of pattern matching to use.
default: wildcard
wildcard string
The wildcard value to search for. For example, *.mov will match only .mov files.
default: *
regex string
The regular expression to be used for pattern matching.
files_per_job integer The maximum number of files to be copied in each job. The default is 1 file per job.
minimum: 1
maximum: 100
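
Here is a hedged sketch that uses regex matching and batches up to 10 files into each generated job instead of the default 1; the expression and batch size are illustrative.

{
  "uid": "folder_enum_task",
  "kind": "folder_enum",
  "payload": {
    "source": {
      "storage_provider": "s3",
      "path": "s3://my_source_bucket/my_source_folder"
    },
    "settings": {
      "pattern_matching": "regex",
      "regex": ".*\\.(mov|mxf)$",
      "recursive": true,
      "files_per_job": 10
    }
  }
}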

Watchfolder Task

Watchfolder

Example Watchfolder Object

{
  "name": "Hybrik Watchfolder Example",
  "payload": {
    "elements": [
      {
        "uid": "watchfolder_source",
        "kind": "watchfolder",
        "task": {
          "tags": [
            "WATCH_FOLDER"
          ]
        },
        "payload": {
          "source": {
            "storage_provider": "s3",
            "path": "s3://my_bucket/my_watchfolder"
          },
          "settings": {
            "key": "watch_folder",
            "frequency": 5,
            "pattern_matching": "wildcard",
            "wildcard": "*",
            "recursive": true,
            "ignore_preexisting_files": true
          }
        }
      },
      {
        "uid": "transcode_task",
        "kind": "transcode",
        "payload": {
          "location": {
            "storage_provider": "s3",
            "path": "s3://my_bucket/my_output_folder"
          },
          "targets": [
            {
              "file_pattern": "{source_basename}_converted.mp4",
              "existing_files": "replace",
              "container": {
                "kind": "mp4"
              },
              "video": {
                "width": 1280,
                "height": 720,
                "codec": "h264",
                "profile": "high",
                "level": "4.0",
                "frame_rate": "24000/1001"
              },
              "audio": [
                {
                  "codec": "aac",
                  "channels": 2,
                  "sample_rate": 48000,
                  "sample_size": 16,
                  "bitrate_kb": 128
                }
              ]
            }
          ]
        }
      }
    ],
    "connections": [
      {
        "from": [
          {
            "element": "wathcfolder_task"
          }
        ],
        "to": {
          "success": [
            {
              "element": "transcode_task"
            }
          ]
        }
      }
    ]
  }
}

Hybrik supports complete automated control through its REST-based API. There are many production workflows, however, that can be managed through a simple Watchfolder without any programming. The goal of a Watchfolder Task is simple: when a new file shows up in the watchfolder location, trigger a new job. A Watchfolder Job looks very similar to a standard encoding job; the difference is that instead of specifying a particular source file, you specify a folder where sources will be found.

watchfolder

Name Type Description
source object The location of the watchfolder.
settings object The settings for the watchfolder.

Settings

Example Settings Object

{
  "uid": "watchfolder_source",
  "kind": "watchfolder",
  "task": {
    "tags": [
      "WATCH_FOLDER"
    ]
  },
  "payload": {
    "source": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_watchfolder"
    },
    "settings": {
      "key": "watch_folder",
      "frequency": 5,
      "pattern_matching": "wildcard",
      "wildcard": "*",
      "recursive": true,
      "ignore_preexisting_files": true
    }
  }
}

watchfolder.settings

Name Type Description
key string A unique key to identify this watchfolder for tracking processed source files.
watch_items_persistence enum                          
tracked
untracked
use_fs_notifiers_where_possible boolean
Use file system change notifications to detect new files, where the storage provider supports them.
interval_sec integer
The interval, in seconds, at which the watched location is scanned for new files.
default: 300
recursive boolean
Recursively watch the folder and its sub-folders.
process_existing_files boolean
When the watchfolder process starts, files may already exist in the watched location. Setting this value to true indicates that these pre-existing files should also trigger new jobs.
pattern_matching enum
wildcard
regex
The type of pattern matching to use.
default: wildcard
wildcard string
The wildcard value to match file names against.
default: *
regex string
A regular expression may be used to match only certain file names for processing.
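
Here is a hedged sketch built strictly from the field names in the table above (the example earlier in this section uses slightly different setting names); the key and regex values are illustrative.

{
  "uid": "watchfolder_source",
  "kind": "watchfolder",
  "task": {
    "tags": [
      "WATCH_FOLDER"
    ]
  },
  "payload": {
    "source": {
      "storage_provider": "s3",
      "path": "s3://my_bucket/my_watchfolder"
    },
    "settings": {
      "key": "my_watchfolder_key",
      "interval_sec": 300,
      "recursive": true,
      "process_existing_files": false,
      "pattern_matching": "regex",
      "regex": ".*\\.mxf$"
    }
  }
}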

DPP Packager Task

Dpp_packager

Example Dpp_packager Object

{
  "name": "Hybrik DPP Packager Example",
  "payload": {
    "elements": [
      {
        "uid": "source_file",
        "kind": "source",
        "payload": {
          "kind": "asset_url",
          "payload": {
            "storage_provider": "s3",
            "url": "s3://my_bucket/my_input_folder/my_file.mp4"
          }
        }
      },
      {
        "uid": "dpp_package_task",
        "kind": "dpp_packager",
        "payload": {
          "target": {
            "location": {
              "storage_provider": "s3",
              "path": "s3://my_bucket/my_destination"
            },
            "file_pattern": "{source_basename}_dpp.mxf"
          },
          "dpp_schema": "d10"
        }
      }
    ],
    "connections": [
      {
        "from": [
          {
            "element": "source_file"
          }
        ],
        "to": {
          "success": [
            {
              "element": "dpp_package_task"
            }
          ]
        }
      }
    ]
  }
}

The Digital Production Partnership (DPP) is a not-for-profit company originally created by the major British broadcasters to define standards and guidelines for television production. Hybrik supports the AS-11 DPP standard, which covers both media creation and metadata packaging. The DPP Packager step runs after the Transcode Task.

dpp_packager

Name Type Description
options object The options for the DPP packager.
target object The target for the DPP package.
dpp_schema enum                          
d10
op1a
rdd9
Specifies which DPP schema will be used.

Options

Example Options Object

{
  "uid": "dpp_package_task",
  "kind": "dpp_packager",
  "options": {
    "delete_sources": true
  },
  "payload": {
    "target": {
      "location": {
        "storage_provider": "s3",
        "path": "s3://my_bucket/my_destination"
      },
      "file_pattern": "{source_basename}_dpp.mxf"
    },
    "dpp_schema": "d10"
  }
}

dpp_packager.options

Name Type Description
delete_sources boolean
Delete the task's source files upon successful completion of the task.

Target

Example Target Object

{
  "uid": "dpp_package_task",
  "kind": "dpp_packager",
  "payload": {
    "target": {
      "location": {
        "storage_provider": "s3",
        "path": "s3://my_bucket/my_destination"
      },
      "file_pattern": "{source_basename}_dpp.mxf"
    },
    "dpp_schema": "d10"
  }
}

dpp_packager.target

Name Type Description
location object The result will be copied to this location.
file_pattern string
This describes the target file name. Placeholders such as {source_basename} for source file name are supported.
default: {source_basename}
existing_files enum                          
delete_and_replace
replace
replace_late
rename_new
rename_org
fail
The desired behavior when a target file already exists. "replace": will delete the original file and write the new one. "rename_new": gives the new file a different auto-generated name. "rename_org": renames the original file. Note that renaming the original file may not be possible depending on the target location. "delete_and_replace": attempts to immediately delete the original file. This will allow for fast failure in the case of inadequate permissions. "replace_late": does not attempt to delete the original -- simply executes a write.
default: fail

BIF Creator Task

Bif_creator

Example Bif_creator Object

{
  "name": "Hybrik BIF Creator Example",
  "payload": {
    "elements": [
      {
        "uid": "source_file",
        "kind": "source",
        "payload": {
          "kind": "asset_url",
          "payload": {
            "storage_provider": "s3",
            "url": "s3://my_bucket/my_input_folder/my_file.mp4"
          }
        }
      },
      {
        "uid": "transcode_task",
        "kind": "transcode",
        "payload": {
          "location": {
            "storage_provider": "s3",
            "path": "s3://my_bucket/my_output_folder"
          },
          "targets": [
            {
              "file_pattern": "{source_basename}_converted.mp4",
              "existing_files": "replace",
              "container": {
                "kind": "mp4"
              },
              "video": {
                "width": 1280,
                "height": 720,
                "codec": "h264",
                "profile": "high",
                "level": "4.0",
                "frame_rate": 23.976
              },
              "audio": [
                {
                  "codec": "aac",
                  "channels": 2,
                  "sample_rate": 48000,
                  "sample_size": 16,
                  "bitrate_kb": 128
                }
              ]
            }
          ]
        }
      },
      {
        "uid": "bif_creator_task",
        "kind": "bif_creator",
        "payload": {
          "source_stream_selection": "highest",
          "location": {
            "storage_provider": "s3",
            "path": "s3://my_bucket/my_output_folder"