<html><body>
<style>
body, h1, h2, h3, div, span, p, pre, a {
margin: 0;
padding: 0;
border: 0;
font-weight: inherit;
font-style: inherit;
font-size: 100%;
font-family: inherit;
vertical-align: baseline;
}
body {
font-size: 13px;
padding: 1em;
}
h1 {
font-size: 26px;
margin-bottom: 1em;
}
h2 {
font-size: 24px;
margin-bottom: 1em;
}
h3 {
font-size: 20px;
margin-bottom: 1em;
margin-top: 1em;
}
pre, code {
line-height: 1.5;
font-family: Monaco, 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', 'Lucida Console', monospace;
}
pre {
margin-top: 0.5em;
}
h1, h2, h3, p {
font-family: Arial, sans-serif;
}
h1, h2, h3 {
border-bottom: solid #CCC 1px;
}
.toc_element {
margin-top: 0.5em;
}
.firstline {
margin-left: 2em;
}
.method {
margin-top: 1em;
border: solid 1px #CCC;
padding: 1em;
background: #EEE;
}
.details {
font-weight: bold;
font-size: 14px;
}
</style>
<h1><a href="transcoder_v1beta1.html">Transcoder API</a> . <a href="transcoder_v1beta1.projects.html">projects</a> . <a href="transcoder_v1beta1.projects.locations.html">locations</a> . <a href="transcoder_v1beta1.projects.locations.jobTemplates.html">jobTemplates</a></h1>
<h2>Instance Methods</h2>
<p class="toc_element">
<code><a href="#close">close()</a></code></p>
<p class="firstline">Close httplib2 connections.</p>
<p class="toc_element">
<code><a href="#create">create(parent, body=None, jobTemplateId=None, x__xgafv=None)</a></code></p>
<p class="firstline">Creates a job template in the specified region.</p>
<p class="toc_element">
<code><a href="#delete">delete(name, x__xgafv=None)</a></code></p>
<p class="firstline">Deletes a job template.</p>
<p class="toc_element">
<code><a href="#get">get(name, x__xgafv=None)</a></code></p>
<p class="firstline">Returns the job template data.</p>
<p class="toc_element">
<code><a href="#list">list(parent, pageSize=None, pageToken=None, x__xgafv=None)</a></code></p>
<p class="firstline">Lists job templates in the specified region.</p>
<p class="toc_element">
<code><a href="#list_next">list_next(previous_request, previous_response)</a></code></p>
<p class="firstline">Retrieves the next page of results.</p>
<h3>Method Details</h3>
<div class="method">
<code class="details" id="close">close()</code>
<pre>Close httplib2 connections.</pre>
</div>
<div class="method">
<code class="details" id="create">create(parent, body=None, jobTemplateId=None, x__xgafv=None)</code>
<pre>Creates a job template in the specified region.
Args:
parent: string, Required. The parent location to create this job template. Format: `projects/{project}/locations/{location}` (required)
body: object, The request body.
The object takes the form of:
{ # Transcoding job template resource.
&quot;config&quot;: { # Job configuration # The configuration for this template.
&quot;adBreaks&quot;: [ # List of ad breaks. Specifies where to insert ad break tags in the output manifests.
{ # Ad break.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # Start time in seconds for the ad break, relative to the output file timeline. The default is `0s`.
},
],
&quot;editList&quot;: [ # List of `Edit atom`s. Defines the ultimate timeline of the resulting file or manifest.
{ # Edit atom.
&quot;endTimeOffset&quot;: &quot;A String&quot;, # End time in seconds for the atom, relative to the input file timeline. When `end_time_offset` is not specified, the `inputs` are used until the end of the atom.
&quot;inputs&quot;: [ # List of `Input.key`s identifying files that should be used in this atom. The listed `inputs` must have the same timeline.
&quot;A String&quot;,
],
&quot;key&quot;: &quot;A String&quot;, # A unique key for this atom. Must be specified when using advanced mapping.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # Start time in seconds for the atom, relative to the input file timeline. The default is `0s`.
},
],
&quot;elementaryStreams&quot;: [ # List of elementary streams.
{ # Encoding of an input file such as an audio, video, or text track. Elementary streams must be packaged before mapping and sharing between different output formats.
&quot;audioStream&quot;: { # Audio stream resource. # Encoding of an audio stream.
&quot;bitrateBps&quot;: 42, # Required. Audio bitrate in bits per second. Must be between 1 and 10,000,000.
&quot;channelCount&quot;: 42, # Number of audio channels. Must be between 1 and 6. The default is 2.
&quot;channelLayout&quot;: [ # A list of channel names specifying layout of the audio channels. This only affects the metadata embedded in the container headers, if supported by the specified format. The default is `[&quot;fl&quot;, &quot;fr&quot;]`. Supported channel names: - &#x27;fl&#x27; - Front left channel - &#x27;fr&#x27; - Front right channel - &#x27;sl&#x27; - Side left channel - &#x27;sr&#x27; - Side right channel - &#x27;fc&#x27; - Front center channel - &#x27;lfe&#x27; - Low frequency
&quot;A String&quot;,
],
&quot;codec&quot;: &quot;A String&quot;, # The codec for this audio stream. The default is `&quot;aac&quot;`. Supported audio codecs: - &#x27;aac&#x27; - &#x27;aac-he&#x27; - &#x27;aac-he-v2&#x27; - &#x27;mp3&#x27; - &#x27;ac3&#x27; - &#x27;eac3&#x27;
&quot;mapping&quot;: [ # The mapping for the `Job.edit_list` atoms with audio `EditAtom.inputs`.
{ # The mapping for the `Job.edit_list` atoms with audio `EditAtom.inputs`.
&quot;channels&quot;: [ # List of `Channel`s for this audio stream.
{ # The audio channel.
&quot;inputs&quot;: [ # List of `Job.inputs` for this audio channel.
{ # Identifies which input file, track, and channel should be used.
&quot;channel&quot;: 42, # Required. The zero-based index of the channel in the input file.
&quot;gainDb&quot;: 3.14, # Audio volume control in dB. Negative values decrease volume, positive values increase. The default is 0.
&quot;key&quot;: &quot;A String&quot;, # Required. The `Input.key` that identifies the input file.
&quot;track&quot;: 42, # Required. The zero-based index of the track in the input file.
},
],
},
],
&quot;key&quot;: &quot;A String&quot;, # Required. The `EditAtom.key` that references the atom with audio inputs in the `Job.edit_list`.
},
],
&quot;sampleRateHertz&quot;: 42, # The audio sample rate in Hertz. The default is 48000 Hertz.
},
&quot;key&quot;: &quot;A String&quot;, # A unique key for this elementary stream.
&quot;textStream&quot;: { # Encoding of a text stream. For example, closed captions or subtitles. # Encoding of a text stream. For example, closed captions or subtitles.
&quot;codec&quot;: &quot;A String&quot;, # The codec for this text stream. The default is `&quot;webvtt&quot;`. Supported text codecs: - &#x27;srt&#x27; - &#x27;ttml&#x27; - &#x27;cea608&#x27; - &#x27;cea708&#x27; - &#x27;webvtt&#x27;
&quot;languageCode&quot;: &quot;A String&quot;, # Required. The BCP-47 language code, such as `&quot;en-US&quot;` or `&quot;sr-Latn&quot;`. For more information, see https://www.unicode.org/reports/tr35/#Unicode_locale_identifier.
&quot;mapping&quot;: [ # The mapping for the `Job.edit_list` atoms with text `EditAtom.inputs`.
{ # The mapping for the `Job.edit_list` atoms with text `EditAtom.inputs`.
&quot;inputs&quot;: [ # List of `Job.inputs` that should be embedded in this atom. Only one input is supported.
{ # Identifies which input file and track should be used.
&quot;key&quot;: &quot;A String&quot;, # Required. The `Input.key` that identifies the input file.
&quot;track&quot;: 42, # Required. The zero-based index of the track in the input file.
},
],
&quot;key&quot;: &quot;A String&quot;, # Required. The `EditAtom.key` that references the atom with text inputs in the `Job.edit_list`.
},
],
},
&quot;videoStream&quot;: { # Video stream resource. # Encoding of a video stream.
&quot;allowOpenGop&quot;: True or False, # Specifies whether an open Group of Pictures (GOP) structure should be allowed or not. The default is `false`.
&quot;aqStrength&quot;: 3.14, # Specify the intensity of the adaptive quantizer (AQ). Must be between 0 and 1, where 0 disables the quantizer and 1 maximizes the quantizer. A higher value equals a lower bitrate but smoother image. The default is 0.
&quot;bFrameCount&quot;: 42, # The number of consecutive B-frames. Must be greater than or equal to zero. Must be less than `VideoStream.gop_frame_count` if set. The default is 0.
&quot;bPyramid&quot;: True or False, # Allow B-pyramid for reference frame selection. This may not be supported on all decoders. The default is `false`.
&quot;bitrateBps&quot;: 42, # Required. The video bitrate in bits per second. The minimum value is 1,000. The maximum value for H264/H265 is 800,000,000. The maximum value for VP9 is 480,000,000.
&quot;codec&quot;: &quot;A String&quot;, # Codec type. The following codecs are supported: * `h264` (default) * `h265` * `vp9`
&quot;crfLevel&quot;: 42, # Target CRF level. Must be between 10 and 36, where 10 is the highest quality and 36 is the most efficient compression. The default is 21.
&quot;enableTwoPass&quot;: True or False, # Use two-pass encoding strategy to achieve better video quality. `VideoStream.rate_control_mode` must be `&quot;vbr&quot;`. The default is `false`.
&quot;entropyCoder&quot;: &quot;A String&quot;, # The entropy coder to use. The default is `&quot;cabac&quot;`. Supported entropy coders: - &#x27;cavlc&#x27; - &#x27;cabac&#x27;
&quot;frameRate&quot;: 3.14, # Required. The target video frame rate in frames per second (FPS). Must be less than or equal to 120. Will default to the input frame rate if larger than the input frame rate. The API will generate an output FPS that is divisible by the input FPS, and less than or equal to the target FPS. See [Calculate frame rate](https://cloud.google.com/transcoder/docs/concepts/frame-rate) for more information.
&quot;gopDuration&quot;: &quot;A String&quot;, # Select the GOP size based on the specified duration. The default is `&quot;3s&quot;`. Note that `gopDuration` must be less than or equal to [`segmentDuration`](#SegmentSettings), and [`segmentDuration`](#SegmentSettings) must be divisible by `gopDuration`.
&quot;gopFrameCount&quot;: 42, # Select the GOP size based on the specified frame count. Must be greater than zero.
&quot;heightPixels&quot;: 42, # The height of the video in pixels. Must be an even integer. When not specified, the height is adjusted to match the specified width and input aspect ratio. If both are omitted, the input height is used.
&quot;pixelFormat&quot;: &quot;A String&quot;, # Pixel format to use. The default is `&quot;yuv420p&quot;`. Supported pixel formats: - &#x27;yuv420p&#x27; pixel format. - &#x27;yuv422p&#x27; pixel format. - &#x27;yuv444p&#x27; pixel format. - &#x27;yuv420p10&#x27; 10-bit HDR pixel format. - &#x27;yuv422p10&#x27; 10-bit HDR pixel format. - &#x27;yuv444p10&#x27; 10-bit HDR pixel format. - &#x27;yuv420p12&#x27; 12-bit HDR pixel format. - &#x27;yuv422p12&#x27; 12-bit HDR pixel format. - &#x27;yuv444p12&#x27; 12-bit HDR pixel format.
&quot;preset&quot;: &quot;A String&quot;, # Enforces the specified codec preset. The default is `veryfast`. The available options are FFmpeg-compatible. Note that certain values for this field may cause the transcoder to override other fields you set in the `VideoStream` message.
&quot;profile&quot;: &quot;A String&quot;, # Enforces the specified codec profile. The following profiles are supported: * `baseline` * `main` * `high` (default) The available options are FFmpeg-compatible. Note that certain values for this field may cause the transcoder to override other fields you set in the `VideoStream` message.
&quot;rateControlMode&quot;: &quot;A String&quot;, # Specify the `rate_control_mode`. The default is `&quot;vbr&quot;`. Supported rate control modes: - &#x27;vbr&#x27; - variable bitrate - &#x27;crf&#x27; - constant rate factor
&quot;tune&quot;: &quot;A String&quot;, # Enforces the specified codec tune. The available options are FFmpeg-compatible. Note that certain values for this field may cause the transcoder to override other fields you set in the `VideoStream` message.
&quot;vbvFullnessBits&quot;: 42, # Initial fullness of the Video Buffering Verifier (VBV) buffer in bits. Must be greater than zero. The default is equal to 90% of `VideoStream.vbv_size_bits`.
&quot;vbvSizeBits&quot;: 42, # Size of the Video Buffering Verifier (VBV) buffer in bits. Must be greater than zero. The default is equal to `VideoStream.bitrate_bps`.
&quot;widthPixels&quot;: 42, # The width of the video in pixels. Must be an even integer. When not specified, the width is adjusted to match the specified height and input aspect ratio. If both are omitted, the input width is used.
},
},
],
&quot;inputs&quot;: [ # List of input assets stored in Cloud Storage.
{ # Input asset.
&quot;key&quot;: &quot;A String&quot;, # A unique key for this input. Must be specified when using advanced mapping and edit lists.
&quot;preprocessingConfig&quot;: { # Preprocessing configurations. # Preprocessing configurations.
&quot;audio&quot;: { # Audio preprocessing configuration. # Audio preprocessing configuration.
&quot;highBoost&quot;: True or False, # Enable boosting high frequency components. The default is `false`.
&quot;lowBoost&quot;: True or False, # Enable boosting low frequency components. The default is `false`.
&quot;lufs&quot;: 3.14, # Specify audio loudness normalization in loudness units relative to full scale (LUFS). Enter a value between -24 and 0 (the default), where: * -24 is the Advanced Television Systems Committee (ATSC A/85) standard * -23 is the EU R128 broadcast standard * -19 is the prior standard for online mono audio * -18 is the ReplayGain standard * -16 is the prior standard for stereo audio * -14 is the new online audio standard recommended by Spotify, as well as Amazon Echo * 0 disables normalization
},
&quot;color&quot;: { # Color preprocessing configuration. # Color preprocessing configuration.
&quot;brightness&quot;: 3.14, # Control brightness of the video. Enter a value between -1 and 1, where -1 is minimum brightness and 1 is maximum brightness. 0 is no change. The default is 0.
&quot;contrast&quot;: 3.14, # Control black and white contrast of the video. Enter a value between -1 and 1, where -1 is minimum contrast and 1 is maximum contrast. 0 is no change. The default is 0.
&quot;saturation&quot;: 3.14, # Control color saturation of the video. Enter a value between -1 and 1, where -1 is fully desaturated and 1 is maximum saturation. 0 is no change. The default is 0.
},
&quot;crop&quot;: { # Video cropping configuration for the input video. The cropped input video is scaled to match the output resolution. # Specify the video cropping configuration.
&quot;bottomPixels&quot;: 42, # The number of pixels to crop from the bottom. The default is 0.
&quot;leftPixels&quot;: 42, # The number of pixels to crop from the left. The default is 0.
&quot;rightPixels&quot;: 42, # The number of pixels to crop from the right. The default is 0.
&quot;topPixels&quot;: 42, # The number of pixels to crop from the top. The default is 0.
},
&quot;deblock&quot;: { # Deblock preprocessing configuration. # Deblock preprocessing configuration.
&quot;enabled&quot;: True or False, # Enable deblocker. The default is `false`.
&quot;strength&quot;: 3.14, # Set strength of the deblocker. Enter a value between 0 and 1. The higher the value, the stronger the block removal. 0 is no deblocking. The default is 0.
},
&quot;denoise&quot;: { # Denoise preprocessing configuration. # Denoise preprocessing configuration.
&quot;strength&quot;: 3.14, # Set strength of the denoise. Enter a value between 0 and 1. The higher the value, the smoother the image. 0 is no denoising. The default is 0.
&quot;tune&quot;: &quot;A String&quot;, # Set the denoiser mode. The default is `&quot;standard&quot;`. Supported denoiser modes: - &#x27;standard&#x27; - &#x27;grain&#x27;
},
&quot;pad&quot;: { # Pad filter configuration for the input video. The padded input video is scaled after padding with black to match the output resolution. # Specify the video pad filter configuration.
&quot;bottomPixels&quot;: 42, # The number of pixels to add to the bottom. The default is 0.
&quot;leftPixels&quot;: 42, # The number of pixels to add to the left. The default is 0.
&quot;rightPixels&quot;: 42, # The number of pixels to add to the right. The default is 0.
&quot;topPixels&quot;: 42, # The number of pixels to add to the top. The default is 0.
},
},
&quot;uri&quot;: &quot;A String&quot;, # URI of the media. Input files must be at least 5 seconds in duration and stored in Cloud Storage (for example, `gs://bucket/inputs/file.mp4`). If empty, the value will be populated from `Job.input_uri`.
},
],
&quot;manifests&quot;: [ # List of output manifests.
{ # Manifest configuration.
&quot;fileName&quot;: &quot;A String&quot;, # The name of the generated file. The default is `&quot;manifest&quot;` with the extension suffix corresponding to the `Manifest.type`.
&quot;muxStreams&quot;: [ # Required. List of user given `MuxStream.key`s that should appear in this manifest. When `Manifest.type` is `HLS`, a media manifest with name `MuxStream.key` and `.m3u8` extension is generated for each element of the `Manifest.mux_streams`.
&quot;A String&quot;,
],
&quot;type&quot;: &quot;A String&quot;, # Required. Type of the manifest, can be &quot;HLS&quot; or &quot;DASH&quot;.
},
],
&quot;muxStreams&quot;: [ # List of multiplexing settings for output streams.
{ # Multiplexing settings for output stream.
&quot;container&quot;: &quot;A String&quot;, # The container format. The default is `&quot;mp4&quot;`. Supported container formats: - &#x27;ts&#x27; - &#x27;fmp4&#x27; - the corresponding file extension is `&quot;.m4s&quot;` - &#x27;mp4&#x27; - &#x27;vtt&#x27;
&quot;elementaryStreams&quot;: [ # List of `ElementaryStream.key`s multiplexed in this stream.
&quot;A String&quot;,
],
&quot;encryption&quot;: { # Encryption settings. # Encryption settings.
&quot;aes128&quot;: { # Configuration for AES-128 encryption. # Configuration for AES-128 encryption.
&quot;keyUri&quot;: &quot;A String&quot;, # Required. URI of the key delivery service. This URI is inserted into the M3U8 header.
},
&quot;iv&quot;: &quot;A String&quot;, # Required. 128 bit Initialization Vector (IV) represented as lowercase hexadecimal digits.
&quot;key&quot;: &quot;A String&quot;, # Required. 128 bit encryption key represented as lowercase hexadecimal digits.
&quot;mpegCenc&quot;: { # Configuration for MPEG Common Encryption (MPEG-CENC). # Configuration for MPEG Common Encryption (MPEG-CENC).
&quot;keyId&quot;: &quot;A String&quot;, # Required. 128 bit Key ID represented as lowercase hexadecimal digits for use with common encryption.
&quot;scheme&quot;: &quot;A String&quot;, # Required. Specify the encryption scheme. Supported encryption schemes: - &#x27;cenc&#x27; - &#x27;cbcs&#x27;
},
&quot;sampleAes&quot;: { # Configuration for SAMPLE-AES encryption. # Configuration for SAMPLE-AES encryption.
&quot;keyUri&quot;: &quot;A String&quot;, # Required. URI of the key delivery service. This URI is inserted into the M3U8 header.
},
},
&quot;fileName&quot;: &quot;A String&quot;, # The name of the generated file. The default is `MuxStream.key` with the extension suffix corresponding to the `MuxStream.container`. Individual segments also have an incremental 10-digit zero-padded suffix starting from 0 before the extension, such as `&quot;mux_stream0000000123.ts&quot;`.
&quot;key&quot;: &quot;A String&quot;, # A unique key for this multiplexed stream. HLS media manifests will be named `MuxStream.key` with the `&quot;.m3u8&quot;` extension suffix.
&quot;segmentSettings&quot;: { # Segment settings for `&quot;ts&quot;`, `&quot;fmp4&quot;` and `&quot;vtt&quot;`. # Segment settings for `&quot;ts&quot;`, `&quot;fmp4&quot;` and `&quot;vtt&quot;`.
&quot;individualSegments&quot;: True or False, # Required. Create an individual segment file. The default is `false`.
&quot;segmentDuration&quot;: &quot;A String&quot;, # Duration of the segments in seconds. The default is `&quot;6.0s&quot;`. Note that `segmentDuration` must be greater than or equal to [`gopDuration`](#videostream), and `segmentDuration` must be divisible by [`gopDuration`](#videostream).
},
},
],
&quot;output&quot;: { # Location of output file(s) in a Cloud Storage bucket. # Output configuration.
&quot;uri&quot;: &quot;A String&quot;, # URI for the output file(s). For example, `gs://my-bucket/outputs/`. If empty, the value is populated from `Job.output_uri`.
},
&quot;overlays&quot;: [ # List of overlays on the output video, in descending Z-order.
{ # Overlay configuration.
&quot;animations&quot;: [ # List of Animations. The list should be chronological, without any time overlap.
{ # Animation types.
&quot;animationEnd&quot;: { # End previous overlay animation from the video. Without AnimationEnd, the overlay object will keep the state of previous animation until the end of the video. # End previous animation.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # The time to end the overlay object, in seconds. Default: 0
},
&quot;animationFade&quot;: { # Display overlay object with fade animation. # Display overlay object with fade animation.
&quot;endTimeOffset&quot;: &quot;A String&quot;, # The time to end the fade animation, in seconds. Default: `start_time_offset` + 1s
&quot;fadeType&quot;: &quot;A String&quot;, # Required. Type of fade animation: `FADE_IN` or `FADE_OUT`.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # The time to start the fade animation, in seconds. Default: 0
&quot;xy&quot;: { # 2D normalized coordinates. Default: `{0.0, 0.0}` # Normalized coordinates based on output video resolution. Valid values: `0.0`–`1.0`. `xy` is the upper-left coordinate of the overlay object. For example, use the x and y coordinates {0,0} to position the top-left corner of the overlay animation in the top-left corner of the output video.
&quot;x&quot;: 3.14, # Normalized x coordinate.
&quot;y&quot;: 3.14, # Normalized y coordinate.
},
},
&quot;animationStatic&quot;: { # Display static overlay object. # Display static overlay object.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # The time to start displaying the overlay object, in seconds. Default: 0
&quot;xy&quot;: { # 2D normalized coordinates. Default: `{0.0, 0.0}` # Normalized coordinates based on output video resolution. Valid values: `0.0`–`1.0`. `xy` is the upper-left coordinate of the overlay object. For example, use the x and y coordinates {0,0} to position the top-left corner of the overlay animation in the top-left corner of the output video.
&quot;x&quot;: 3.14, # Normalized x coordinate.
&quot;y&quot;: 3.14, # Normalized y coordinate.
},
},
},
],
&quot;image&quot;: { # Overlaid jpeg image. # Image overlay.
&quot;alpha&quot;: 3.14, # Target image opacity. Valid values are from `1.0` (solid, default) to `0.0` (transparent), exclusive. Set this to a value greater than `0.0`.
&quot;resolution&quot;: { # 2D normalized coordinates. Default: `{0.0, 0.0}` # Normalized image resolution, based on output video resolution. Valid values: `0.0`–`1.0`. To respect the original image aspect ratio, set either `x` or `y` to `0.0`. To use the original image resolution, set both `x` and `y` to `0.0`.
&quot;x&quot;: 3.14, # Normalized x coordinate.
&quot;y&quot;: 3.14, # Normalized y coordinate.
},
&quot;uri&quot;: &quot;A String&quot;, # Required. URI of the JPEG image in Cloud Storage. For example, `gs://bucket/inputs/image.jpeg`. JPEG is the only supported image type.
},
},
],
&quot;pubsubDestination&quot;: { # A Pub/Sub destination. # Destination on Pub/Sub.
&quot;topic&quot;: &quot;A String&quot;, # The name of the Pub/Sub topic to publish job completion notification to. For example: `projects/{project}/topics/{topic}`.
},
&quot;spriteSheets&quot;: [ # List of output sprite sheets.
{ # Sprite sheet configuration.
&quot;columnCount&quot;: 42, # The maximum number of sprites per row in a sprite sheet. The default is 0, which indicates no maximum limit.
&quot;endTimeOffset&quot;: &quot;A String&quot;, # End time in seconds, relative to the output file timeline. When `end_time_offset` is not specified, the sprites are generated until the end of the output file.
&quot;filePrefix&quot;: &quot;A String&quot;, # Required. File name prefix for the generated sprite sheets. Each sprite sheet has an incremental 10-digit zero-padded suffix starting from 0 before the extension, such as `&quot;sprite_sheet0000000123.jpeg&quot;`.
&quot;format&quot;: &quot;A String&quot;, # Format type. The default is `&quot;jpeg&quot;`. Supported formats: - &#x27;jpeg&#x27;
&quot;interval&quot;: &quot;A String&quot;, # Starting from `0s`, create sprites at regular intervals. Specify the interval value in seconds.
&quot;quality&quot;: 42, # The quality of the generated sprite sheet. Enter a value between 1 and 100, where 1 is the lowest quality and 100 is the highest quality. The default is 100. A high quality value corresponds to a low image data compression ratio.
&quot;rowCount&quot;: 42, # The maximum number of rows per sprite sheet. When the sprite sheet is full, a new sprite sheet is created. The default is 0, which indicates no maximum limit.
&quot;spriteHeightPixels&quot;: 42, # Required. The height of sprite in pixels. Must be an even integer. To preserve the source aspect ratio, set the SpriteSheet.sprite_height_pixels field or the SpriteSheet.sprite_width_pixels field, but not both (the API will automatically calculate the missing field).
&quot;spriteWidthPixels&quot;: 42, # Required. The width of sprite in pixels. Must be an even integer. To preserve the source aspect ratio, set the SpriteSheet.sprite_width_pixels field or the SpriteSheet.sprite_height_pixels field, but not both (the API will automatically calculate the missing field).
&quot;startTimeOffset&quot;: &quot;A String&quot;, # Start time in seconds, relative to the output file timeline. Determines the first sprite to pick. The default is `0s`.
&quot;totalCount&quot;: 42, # Total number of sprites. Create the specified number of sprites distributed evenly across the timeline of the output media. The default is 100.
},
],
},
&quot;name&quot;: &quot;A String&quot;, # The resource name of the job template. Format: `projects/{project}/locations/{location}/jobTemplates/{job_template}`
}
jobTemplateId: string, Required. The ID to use for the job template, which will become the final component of the job template&#x27;s resource name. This value should be 4-63 characters, and valid characters must match the regular expression `a-zA-Z*`.
x__xgafv: string, V1 error format.
Allowed values
1 - v1 error format
2 - v2 error format
Returns:
An object of the form:
{ # Transcoding job template resource.
&quot;config&quot;: { # Job configuration # The configuration for this template.
&quot;adBreaks&quot;: [ # List of ad breaks. Specifies where to insert ad break tags in the output manifests.
{ # Ad break.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # Start time in seconds for the ad break, relative to the output file timeline. The default is `0s`.
},
],
&quot;editList&quot;: [ # List of `Edit atom`s. Defines the ultimate timeline of the resulting file or manifest.
{ # Edit atom.
&quot;endTimeOffset&quot;: &quot;A String&quot;, # End time in seconds for the atom, relative to the input file timeline. When `end_time_offset` is not specified, the `inputs` are used until the end of the atom.
&quot;inputs&quot;: [ # List of `Input.key`s identifying files that should be used in this atom. The listed `inputs` must have the same timeline.
&quot;A String&quot;,
],
&quot;key&quot;: &quot;A String&quot;, # A unique key for this atom. Must be specified when using advanced mapping.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # Start time in seconds for the atom, relative to the input file timeline. The default is `0s`.
},
],
&quot;elementaryStreams&quot;: [ # List of elementary streams.
{ # Encoding of an input file such as an audio, video, or text track. Elementary streams must be packaged before mapping and sharing between different output formats.
&quot;audioStream&quot;: { # Audio stream resource. # Encoding of an audio stream.
&quot;bitrateBps&quot;: 42, # Required. Audio bitrate in bits per second. Must be between 1 and 10,000,000.
&quot;channelCount&quot;: 42, # Number of audio channels. Must be between 1 and 6. The default is 2.
&quot;channelLayout&quot;: [ # A list of channel names specifying layout of the audio channels. This only affects the metadata embedded in the container headers, if supported by the specified format. The default is `[&quot;fl&quot;, &quot;fr&quot;]`. Supported channel names: - &#x27;fl&#x27; - Front left channel - &#x27;fr&#x27; - Front right channel - &#x27;sl&#x27; - Side left channel - &#x27;sr&#x27; - Side right channel - &#x27;fc&#x27; - Front center channel - &#x27;lfe&#x27; - Low frequency
&quot;A String&quot;,
],
&quot;codec&quot;: &quot;A String&quot;, # The codec for this audio stream. The default is `&quot;aac&quot;`. Supported audio codecs: - &#x27;aac&#x27; - &#x27;aac-he&#x27; - &#x27;aac-he-v2&#x27; - &#x27;mp3&#x27; - &#x27;ac3&#x27; - &#x27;eac3&#x27;
&quot;mapping&quot;: [ # The mapping for the `Job.edit_list` atoms with audio `EditAtom.inputs`.
{ # The mapping for the `Job.edit_list` atoms with audio `EditAtom.inputs`.
&quot;channels&quot;: [ # List of `Channel`s for this audio stream.
{ # The audio channel.
&quot;inputs&quot;: [ # List of `Job.inputs` for this audio channel.
{ # Identifies which input file, track, and channel should be used.
&quot;channel&quot;: 42, # Required. The zero-based index of the channel in the input file.
&quot;gainDb&quot;: 3.14, # Audio volume control in dB. Negative values decrease volume, positive values increase. The default is 0.
&quot;key&quot;: &quot;A String&quot;, # Required. The `Input.key` that identifies the input file.
&quot;track&quot;: 42, # Required. The zero-based index of the track in the input file.
},
],
},
],
&quot;key&quot;: &quot;A String&quot;, # Required. The `EditAtom.key` that references the atom with audio inputs in the `Job.edit_list`.
},
],
&quot;sampleRateHertz&quot;: 42, # The audio sample rate in Hertz. The default is 48000 Hertz.
},
&quot;key&quot;: &quot;A String&quot;, # A unique key for this elementary stream.
&quot;textStream&quot;: { # Encoding of a text stream. For example, closed captions or subtitles. # Encoding of a text stream. For example, closed captions or subtitles.
&quot;codec&quot;: &quot;A String&quot;, # The codec for this text stream. The default is `&quot;webvtt&quot;`. Supported text codecs: - &#x27;srt&#x27; - &#x27;ttml&#x27; - &#x27;cea608&#x27; - &#x27;cea708&#x27; - &#x27;webvtt&#x27;
&quot;languageCode&quot;: &quot;A String&quot;, # Required. The BCP-47 language code, such as `&quot;en-US&quot;` or `&quot;sr-Latn&quot;`. For more information, see https://www.unicode.org/reports/tr35/#Unicode_locale_identifier.
&quot;mapping&quot;: [ # The mapping for the `Job.edit_list` atoms with text `EditAtom.inputs`.
{ # The mapping for the `Job.edit_list` atoms with text `EditAtom.inputs`.
&quot;inputs&quot;: [ # List of `Job.inputs` that should be embedded in this atom. Only one input is supported.
{ # Identifies which input file and track should be used.
&quot;key&quot;: &quot;A String&quot;, # Required. The `Input.key` that identifies the input file.
&quot;track&quot;: 42, # Required. The zero-based index of the track in the input file.
},
],
&quot;key&quot;: &quot;A String&quot;, # Required. The `EditAtom.key` that references the atom with text inputs in the `Job.edit_list`.
},
],
},
&quot;videoStream&quot;: { # Video stream resource. # Encoding of a video stream.
&quot;allowOpenGop&quot;: True or False, # Specifies whether an open Group of Pictures (GOP) structure should be allowed or not. The default is `false`.
&quot;aqStrength&quot;: 3.14, # Specify the intensity of the adaptive quantizer (AQ). Must be between 0 and 1, where 0 disables the quantizer and 1 maximizes the quantizer. A higher value equals a lower bitrate but smoother image. The default is 0.
&quot;bFrameCount&quot;: 42, # The number of consecutive B-frames. Must be greater than or equal to zero. Must be less than `VideoStream.gop_frame_count` if set. The default is 0.
&quot;bPyramid&quot;: True or False, # Allow B-pyramid for reference frame selection. This may not be supported on all decoders. The default is `false`.
&quot;bitrateBps&quot;: 42, # Required. The video bitrate in bits per second. The minimum value is 1,000. The maximum value for H264/H265 is 800,000,000. The maximum value for VP9 is 480,000,000.
&quot;codec&quot;: &quot;A String&quot;, # Codec type. The following codecs are supported: * `h264` (default) * `h265` * `vp9`
&quot;crfLevel&quot;: 42, # Target CRF level. Must be between 10 and 36, where 10 is the highest quality and 36 is the most efficient compression. The default is 21.
&quot;enableTwoPass&quot;: True or False, # Use two-pass encoding strategy to achieve better video quality. `VideoStream.rate_control_mode` must be `&quot;vbr&quot;`. The default is `false`.
&quot;entropyCoder&quot;: &quot;A String&quot;, # The entropy coder to use. The default is `&quot;cabac&quot;`. Supported entropy coders: - &#x27;cavlc&#x27; - &#x27;cabac&#x27;
&quot;frameRate&quot;: 3.14, # Required. The target video frame rate in frames per second (FPS). Must be less than or equal to 120. Will default to the input frame rate if larger than the input frame rate. The API will generate an output FPS that is divisible by the input FPS, and less than or equal to the target FPS. See [Calculate frame rate](https://cloud.google.com/transcoder/docs/concepts/frame-rate) for more information.
&quot;gopDuration&quot;: &quot;A String&quot;, # Select the GOP size based on the specified duration. The default is `&quot;3s&quot;`. Note that `gopDuration` must be less than or equal to [`segmentDuration`](#SegmentSettings), and [`segmentDuration`](#SegmentSettings) must be divisible by `gopDuration`.
&quot;gopFrameCount&quot;: 42, # Select the GOP size based on the specified frame count. Must be greater than zero.
&quot;heightPixels&quot;: 42, # The height of the video in pixels. Must be an even integer. When not specified, the height is adjusted to match the specified width and input aspect ratio. If both are omitted, the input height is used.
&quot;pixelFormat&quot;: &quot;A String&quot;, # Pixel format to use. The default is `&quot;yuv420p&quot;`. Supported pixel formats: - &#x27;yuv420p&#x27; pixel format. - &#x27;yuv422p&#x27; pixel format. - &#x27;yuv444p&#x27; pixel format. - &#x27;yuv420p10&#x27; 10-bit HDR pixel format. - &#x27;yuv422p10&#x27; 10-bit HDR pixel format. - &#x27;yuv444p10&#x27; 10-bit HDR pixel format. - &#x27;yuv420p12&#x27; 12-bit HDR pixel format. - &#x27;yuv422p12&#x27; 12-bit HDR pixel format. - &#x27;yuv444p12&#x27; 12-bit HDR pixel format.
&quot;preset&quot;: &quot;A String&quot;, # Enforces the specified codec preset. The default is `veryfast`. The available options are FFmpeg-compatible. Note that certain values for this field may cause the transcoder to override other fields you set in the `VideoStream` message.
&quot;profile&quot;: &quot;A String&quot;, # Enforces the specified codec profile. The following profiles are supported: * `baseline` * `main` * `high` (default) The available options are FFmpeg-compatible. Note that certain values for this field may cause the transcoder to override other fields you set in the `VideoStream` message.
&quot;rateControlMode&quot;: &quot;A String&quot;, # Specify the `rate_control_mode`. The default is `&quot;vbr&quot;`. Supported rate control modes: - &#x27;vbr&#x27; - variable bitrate - &#x27;crf&#x27; - constant rate factor
&quot;tune&quot;: &quot;A String&quot;, # Enforces the specified codec tune. The available options are FFmpeg-compatible. Note that certain values for this field may cause the transcoder to override other fields you set in the `VideoStream` message.
&quot;vbvFullnessBits&quot;: 42, # Initial fullness of the Video Buffering Verifier (VBV) buffer in bits. Must be greater than zero. The default is equal to 90% of `VideoStream.vbv_size_bits`.
&quot;vbvSizeBits&quot;: 42, # Size of the Video Buffering Verifier (VBV) buffer in bits. Must be greater than zero. The default is equal to `VideoStream.bitrate_bps`.
&quot;widthPixels&quot;: 42, # The width of the video in pixels. Must be an even integer. When not specified, the width is adjusted to match the specified height and input aspect ratio. If both are omitted, the input width is used.
},
},
],
&quot;inputs&quot;: [ # List of input assets stored in Cloud Storage.
{ # Input asset.
&quot;key&quot;: &quot;A String&quot;, # A unique key for this input. Must be specified when using advanced mapping and edit lists.
&quot;preprocessingConfig&quot;: { # Preprocessing configurations. # Preprocessing configurations.
&quot;audio&quot;: { # Audio preprocessing configuration. # Audio preprocessing configuration.
&quot;highBoost&quot;: True or False, # Enable boosting high frequency components. The default is `false`.
&quot;lowBoost&quot;: True or False, # Enable boosting low frequency components. The default is `false`.
&quot;lufs&quot;: 3.14, # Specify audio loudness normalization in loudness units relative to full scale (LUFS). Enter a value between -24 and 0 (the default), where: * -24 is the Advanced Television Systems Committee (ATSC A/85) standard * -23 is the EU R128 broadcast standard * -19 is the prior standard for online mono audio * -18 is the ReplayGain standard * -16 is the prior standard for stereo audio * -14 is the new online audio standard recommended by Spotify, as well as Amazon Echo * 0 disables normalization
},
&quot;color&quot;: { # Color preprocessing configuration. # Color preprocessing configuration.
&quot;brightness&quot;: 3.14, # Control brightness of the video. Enter a value between -1 and 1, where -1 is minimum brightness and 1 is maximum brightness. 0 is no change. The default is 0.
&quot;contrast&quot;: 3.14, # Control black and white contrast of the video. Enter a value between -1 and 1, where -1 is minimum contrast and 1 is maximum contrast. 0 is no change. The default is 0.
&quot;saturation&quot;: 3.14, # Control color saturation of the video. Enter a value between -1 and 1, where -1 is fully desaturated and 1 is maximum saturation. 0 is no change. The default is 0.
},
&quot;crop&quot;: { # Video cropping configuration for the input video. The cropped input video is scaled to match the output resolution. # Specify the video cropping configuration.
&quot;bottomPixels&quot;: 42, # The number of pixels to crop from the bottom. The default is 0.
&quot;leftPixels&quot;: 42, # The number of pixels to crop from the left. The default is 0.
&quot;rightPixels&quot;: 42, # The number of pixels to crop from the right. The default is 0.
&quot;topPixels&quot;: 42, # The number of pixels to crop from the top. The default is 0.
},
&quot;deblock&quot;: { # Deblock preprocessing configuration. # Deblock preprocessing configuration.
&quot;enabled&quot;: True or False, # Enable deblocker. The default is `false`.
&quot;strength&quot;: 3.14, # Set strength of the deblocker. Enter a value between 0 and 1. The higher the value, the stronger the block removal. 0 is no deblocking. The default is 0.
},
&quot;denoise&quot;: { # Denoise preprocessing configuration. # Denoise preprocessing configuration.
&quot;strength&quot;: 3.14, # Set strength of the denoise. Enter a value between 0 and 1. The higher the value, the smoother the image. 0 is no denoising. The default is 0.
&quot;tune&quot;: &quot;A String&quot;, # Set the denoiser mode. The default is `&quot;standard&quot;`. Supported denoiser modes: - &#x27;standard&#x27; - &#x27;grain&#x27;
},
&quot;pad&quot;: { # Pad filter configuration for the input video. The padded input video is scaled after padding with black to match the output resolution. # Specify the video pad filter configuration.
&quot;bottomPixels&quot;: 42, # The number of pixels to add to the bottom. The default is 0.
&quot;leftPixels&quot;: 42, # The number of pixels to add to the left. The default is 0.
&quot;rightPixels&quot;: 42, # The number of pixels to add to the right. The default is 0.
&quot;topPixels&quot;: 42, # The number of pixels to add to the top. The default is 0.
},
},
&quot;uri&quot;: &quot;A String&quot;, # URI of the media. Input files must be at least 5 seconds in duration and stored in Cloud Storage (for example, `gs://bucket/inputs/file.mp4`). If empty, the value will be populated from `Job.input_uri`.
},
],
&quot;manifests&quot;: [ # List of output manifests.
{ # Manifest configuration.
&quot;fileName&quot;: &quot;A String&quot;, # The name of the generated file. The default is `&quot;manifest&quot;` with the extension suffix corresponding to the `Manifest.type`.
&quot;muxStreams&quot;: [ # Required. List of user given `MuxStream.key`s that should appear in this manifest. When `Manifest.type` is `HLS`, a media manifest with name `MuxStream.key` and `.m3u8` extension is generated for each element of the `Manifest.mux_streams`.
&quot;A String&quot;,
],
&quot;type&quot;: &quot;A String&quot;, # Required. Type of the manifest, can be &quot;HLS&quot; or &quot;DASH&quot;.
},
],
&quot;muxStreams&quot;: [ # List of multiplexing settings for output streams.
{ # Multiplexing settings for output stream.
&quot;container&quot;: &quot;A String&quot;, # The container format. The default is `&quot;mp4&quot;`. Supported container formats: - &#x27;ts&#x27; - &#x27;fmp4&#x27; - the corresponding file extension is `&quot;.m4s&quot;` - &#x27;mp4&#x27; - &#x27;vtt&#x27;
&quot;elementaryStreams&quot;: [ # List of `ElementaryStream.key`s multiplexed in this stream.
&quot;A String&quot;,
],
&quot;encryption&quot;: { # Encryption settings. # Encryption settings.
&quot;aes128&quot;: { # Configuration for AES-128 encryption. # Configuration for AES-128 encryption.
&quot;keyUri&quot;: &quot;A String&quot;, # Required. URI of the key delivery service. This URI is inserted into the M3U8 header.
},
&quot;iv&quot;: &quot;A String&quot;, # Required. 128 bit Initialization Vector (IV) represented as lowercase hexadecimal digits.
&quot;key&quot;: &quot;A String&quot;, # Required. 128 bit encryption key represented as lowercase hexadecimal digits.
&quot;mpegCenc&quot;: { # Configuration for MPEG Common Encryption (MPEG-CENC). # Configuration for MPEG Common Encryption (MPEG-CENC).
&quot;keyId&quot;: &quot;A String&quot;, # Required. 128 bit Key ID represented as lowercase hexadecimal digits for use with common encryption.
&quot;scheme&quot;: &quot;A String&quot;, # Required. Specify the encryption scheme. Supported encryption schemes: - &#x27;cenc&#x27; - &#x27;cbcs&#x27;
},
&quot;sampleAes&quot;: { # Configuration for SAMPLE-AES encryption. # Configuration for SAMPLE-AES encryption.
&quot;keyUri&quot;: &quot;A String&quot;, # Required. URI of the key delivery service. This URI is inserted into the M3U8 header.
},
},
&quot;fileName&quot;: &quot;A String&quot;, # The name of the generated file. The default is `MuxStream.key` with the extension suffix corresponding to the `MuxStream.container`. Individual segments also have an incremental 10-digit zero-padded suffix starting from 0 before the extension, such as `&quot;mux_stream0000000123.ts&quot;`.
&quot;key&quot;: &quot;A String&quot;, # A unique key for this multiplexed stream. HLS media manifests will be named `MuxStream.key` with the `&quot;.m3u8&quot;` extension suffix.
&quot;segmentSettings&quot;: { # Segment settings for `&quot;ts&quot;`, `&quot;fmp4&quot;` and `&quot;vtt&quot;`. # Segment settings for `&quot;ts&quot;`, `&quot;fmp4&quot;` and `&quot;vtt&quot;`.
&quot;individualSegments&quot;: True or False, # Required. Create an individual segment file. The default is `false`.
&quot;segmentDuration&quot;: &quot;A String&quot;, # Duration of the segments in seconds. The default is `&quot;6.0s&quot;`. Note that `segmentDuration` must be greater than or equal to [`gopDuration`](#videostream), and `segmentDuration` must be divisible by [`gopDuration`](#videostream).
},
},
],
&quot;output&quot;: { # Location of output file(s) in a Cloud Storage bucket. # Output configuration.
&quot;uri&quot;: &quot;A String&quot;, # URI for the output file(s). For example, `gs://my-bucket/outputs/`. If empty, the value is populated from `Job.output_uri`.
},
&quot;overlays&quot;: [ # List of overlays on the output video, in descending Z-order.
{ # Overlay configuration.
&quot;animations&quot;: [ # List of Animations. The list should be chronological, without any time overlap.
{ # Animation types.
&quot;animationEnd&quot;: { # End previous overlay animation from the video. Without AnimationEnd, the overlay object will keep the state of previous animation until the end of the video. # End previous animation.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # The time to end the overlay object, in seconds. Default: 0
},
&quot;animationFade&quot;: { # Display overlay object with fade animation. # Display overlay object with fade animation.
&quot;endTimeOffset&quot;: &quot;A String&quot;, # The time to end the fade animation, in seconds. Default: `start_time_offset` + 1s
&quot;fadeType&quot;: &quot;A String&quot;, # Required. Type of fade animation: `FADE_IN` or `FADE_OUT`.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # The time to start the fade animation, in seconds. Default: 0
&quot;xy&quot;: { # 2D normalized coordinates. Default: `{0.0, 0.0}` # Normalized coordinates based on output video resolution. Valid values: `0.0`–`1.0`. `xy` is the upper-left coordinate of the overlay object. For example, use the x and y coordinates {0,0} to position the top-left corner of the overlay animation in the top-left corner of the output video.
&quot;x&quot;: 3.14, # Normalized x coordinate.
&quot;y&quot;: 3.14, # Normalized y coordinate.
},
},
&quot;animationStatic&quot;: { # Display static overlay object. # Display static overlay object.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # The time to start displaying the overlay object, in seconds. Default: 0
&quot;xy&quot;: { # 2D normalized coordinates. Default: `{0.0, 0.0}` # Normalized coordinates based on output video resolution. Valid values: `0.0`–`1.0`. `xy` is the upper-left coordinate of the overlay object. For example, use the x and y coordinates {0,0} to position the top-left corner of the overlay animation in the top-left corner of the output video.
&quot;x&quot;: 3.14, # Normalized x coordinate.
&quot;y&quot;: 3.14, # Normalized y coordinate.
},
},
},
],
&quot;image&quot;: { # Overlaid jpeg image. # Image overlay.
&quot;alpha&quot;: 3.14, # Target image opacity. Valid values are from `1.0` (solid, default) to `0.0` (transparent), exclusive. Set this to a value greater than `0.0`.
&quot;resolution&quot;: { # 2D normalized coordinates. Default: `{0.0, 0.0}` # Normalized image resolution, based on output video resolution. Valid values: `0.0`–`1.0`. To respect the original image aspect ratio, set either `x` or `y` to `0.0`. To use the original image resolution, set both `x` and `y` to `0.0`.
&quot;x&quot;: 3.14, # Normalized x coordinate.
&quot;y&quot;: 3.14, # Normalized y coordinate.
},
&quot;uri&quot;: &quot;A String&quot;, # Required. URI of the JPEG image in Cloud Storage. For example, `gs://bucket/inputs/image.jpeg`. JPEG is the only supported image type.
},
},
],
&quot;pubsubDestination&quot;: { # A Pub/Sub destination. # Destination on Pub/Sub.
&quot;topic&quot;: &quot;A String&quot;, # The name of the Pub/Sub topic to publish job completion notification to. For example: `projects/{project}/topics/{topic}`.
},
&quot;spriteSheets&quot;: [ # List of output sprite sheets.
{ # Sprite sheet configuration.
&quot;columnCount&quot;: 42, # The maximum number of sprites per row in a sprite sheet. The default is 0, which indicates no maximum limit.
&quot;endTimeOffset&quot;: &quot;A String&quot;, # End time in seconds, relative to the output file timeline. When `end_time_offset` is not specified, the sprites are generated until the end of the output file.
&quot;filePrefix&quot;: &quot;A String&quot;, # Required. File name prefix for the generated sprite sheets. Each sprite sheet has an incremental 10-digit zero-padded suffix starting from 0 before the extension, such as `&quot;sprite_sheet0000000123.jpeg&quot;`.
&quot;format&quot;: &quot;A String&quot;, # Format type. The default is `&quot;jpeg&quot;`. Supported formats: - &#x27;jpeg&#x27;
&quot;interval&quot;: &quot;A String&quot;, # Starting from `0s`, create sprites at regular intervals. Specify the interval value in seconds.
&quot;quality&quot;: 42, # The quality of the generated sprite sheet. Enter a value between 1 and 100, where 1 is the lowest quality and 100 is the highest quality. The default is 100. A high quality value corresponds to a low image data compression ratio.
&quot;rowCount&quot;: 42, # The maximum number of rows per sprite sheet. When the sprite sheet is full, a new sprite sheet is created. The default is 0, which indicates no maximum limit.
&quot;spriteHeightPixels&quot;: 42, # Required. The height of sprite in pixels. Must be an even integer. To preserve the source aspect ratio, set the SpriteSheet.sprite_height_pixels field or the SpriteSheet.sprite_width_pixels field, but not both (the API will automatically calculate the missing field).
&quot;spriteWidthPixels&quot;: 42, # Required. The width of sprite in pixels. Must be an even integer. To preserve the source aspect ratio, set the SpriteSheet.sprite_width_pixels field or the SpriteSheet.sprite_height_pixels field, but not both (the API will automatically calculate the missing field).
&quot;startTimeOffset&quot;: &quot;A String&quot;, # Start time in seconds, relative to the output file timeline. Determines the first sprite to pick. The default is `0s`.
&quot;totalCount&quot;: 42, # Total number of sprites. Create the specified number of sprites distributed evenly across the timeline of the output media. The default is 100.
},
],
},
&quot;name&quot;: &quot;A String&quot;, # The resource name of the job template. Format: `projects/{project}/locations/{location}/jobTemplates/{job_template}`
}</pre>
</div>
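<p>A minimal sketch of calling <code>create()</code>, assuming the <code>job_templates</code> resource from the earlier sketch. The template ID, parent, stream keys, and encoding settings below are illustrative placeholders, not values mandated by the API:</p>
<pre>
# Sketch: create a job template with one H.264 video stream and one AAC audio
# stream muxed into an MP4 container. All keys and IDs are placeholders.
body = {
    'config': {
        'elementaryStreams': [
            {'key': 'video-stream0',
             'videoStream': {'codec': 'h264', 'bitrateBps': 2500000,
                             'frameRate': 30, 'heightPixels': 720,
                             'widthPixels': 1280}},
            {'key': 'audio-stream0',
             'audioStream': {'codec': 'aac', 'bitrateBps': 64000}},
        ],
        'muxStreams': [
            {'key': 'sd', 'container': 'mp4',
             'elementaryStreams': ['video-stream0', 'audio-stream0']},
        ],
    },
}
response = job_templates.create(
    parent='projects/my-project/locations/us-central1',  # placeholder
    jobTemplateId='my-template',                          # placeholder
    body=body).execute()
print(response['name'])
</pre>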
<div class="method">
<code class="details" id="delete">delete(name, x__xgafv=None)</code>
<pre>Deletes a job template.
Args:
name: string, Required. The name of the job template to delete. `projects/{project}/locations/{location}/jobTemplates/{job_template}` (required)
x__xgafv: string, V1 error format.
Allowed values
1 - v1 error format
2 - v2 error format
Returns:
An object of the form:
{ # A generic empty message that you can re-use to avoid defining duplicated empty messages in your APIs. A typical example is to use it as the request or the response type of an API method. For instance: service Foo { rpc Bar(google.protobuf.Empty) returns (google.protobuf.Empty); } The JSON representation for `Empty` is an empty JSON object `{}`.
}</pre>
</div>
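<p>A minimal sketch of calling <code>delete()</code>, again assuming the <code>job_templates</code> resource from the earlier sketch; the resource name below is a placeholder:</p>
<pre>
# Sketch: delete a job template by its full resource name (placeholder values).
name = 'projects/my-project/locations/us-central1/jobTemplates/my-template'
job_templates.delete(name=name).execute()  # returns an empty object ({}) on success
</pre>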
<div class="method">
<code class="details" id="get">get(name, x__xgafv=None)</code>
<pre>Returns the job template data.
Args:
name: string, Required. The name of the job template to retrieve. Format: `projects/{project}/locations/{location}/jobTemplates/{job_template}` (required)
x__xgafv: string, V1 error format.
Allowed values
1 - v1 error format
2 - v2 error format
Returns:
An object of the form:
{ # Transcoding job template resource.
&quot;config&quot;: { # Job configuration # The configuration for this template.
&quot;adBreaks&quot;: [ # List of ad breaks. Specifies where to insert ad break tags in the output manifests.
{ # Ad break.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # Start time in seconds for the ad break, relative to the output file timeline. The default is `0s`.
},
],
&quot;editList&quot;: [ # List of `Edit atom`s. Defines the ultimate timeline of the resulting file or manifest.
{ # Edit atom.
&quot;endTimeOffset&quot;: &quot;A String&quot;, # End time in seconds for the atom, relative to the input file timeline. When `end_time_offset` is not specified, the `inputs` are used until the end of the atom.
&quot;inputs&quot;: [ # List of `Input.key`s identifying files that should be used in this atom. The listed `inputs` must have the same timeline.
&quot;A String&quot;,
],
&quot;key&quot;: &quot;A String&quot;, # A unique key for this atom. Must be specified when using advanced mapping.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # Start time in seconds for the atom, relative to the input file timeline. The default is `0s`.
},
],
&quot;elementaryStreams&quot;: [ # List of elementary streams.
{ # Encoding of an input file such as an audio, video, or text track. Elementary streams must be packaged before mapping and sharing between different output formats.
&quot;audioStream&quot;: { # Audio stream resource. # Encoding of an audio stream.
&quot;bitrateBps&quot;: 42, # Required. Audio bitrate in bits per second. Must be between 1 and 10,000,000.
&quot;channelCount&quot;: 42, # Number of audio channels. Must be between 1 and 6. The default is 2.
&quot;channelLayout&quot;: [ # A list of channel names specifying layout of the audio channels. This only affects the metadata embedded in the container headers, if supported by the specified format. The default is `[&quot;fl&quot;, &quot;fr&quot;]`. Supported channel names: - &#x27;fl&#x27; - Front left channel - &#x27;fr&#x27; - Front right channel - &#x27;sl&#x27; - Side left channel - &#x27;sr&#x27; - Side right channel - &#x27;fc&#x27; - Front center channel - &#x27;lfe&#x27; - Low frequency
&quot;A String&quot;,
],
&quot;codec&quot;: &quot;A String&quot;, # The codec for this audio stream. The default is `&quot;aac&quot;`. Supported audio codecs: - &#x27;aac&#x27; - &#x27;aac-he&#x27; - &#x27;aac-he-v2&#x27; - &#x27;mp3&#x27; - &#x27;ac3&#x27; - &#x27;eac3&#x27;
&quot;mapping&quot;: [ # The mapping for the `Job.edit_list` atoms with audio `EditAtom.inputs`.
{ # The mapping for the `Job.edit_list` atoms with audio `EditAtom.inputs`.
&quot;channels&quot;: [ # List of `Channel`s for this audio stream.
{ # The audio channel.
&quot;inputs&quot;: [ # List of `Job.inputs` for this audio channel.
{ # Identifies which input file, track, and channel should be used.
&quot;channel&quot;: 42, # Required. The zero-based index of the channel in the input file.
&quot;gainDb&quot;: 3.14, # Audio volume control in dB. Negative values decrease volume, positive values increase. The default is 0.
&quot;key&quot;: &quot;A String&quot;, # Required. The `Input.key` that identifies the input file.
&quot;track&quot;: 42, # Required. The zero-based index of the track in the input file.
},
],
},
],
&quot;key&quot;: &quot;A String&quot;, # Required. The `EditAtom.key` that references the atom with audio inputs in the `Job.edit_list`.
},
],
&quot;sampleRateHertz&quot;: 42, # The audio sample rate in Hertz. The default is 48000 Hertz.
},
&quot;key&quot;: &quot;A String&quot;, # A unique key for this elementary stream.
&quot;textStream&quot;: { # Encoding of a text stream. For example, closed captions or subtitles. # Encoding of a text stream. For example, closed captions or subtitles.
&quot;codec&quot;: &quot;A String&quot;, # The codec for this text stream. The default is `&quot;webvtt&quot;`. Supported text codecs: - &#x27;srt&#x27; - &#x27;ttml&#x27; - &#x27;cea608&#x27; - &#x27;cea708&#x27; - &#x27;webvtt&#x27;
&quot;languageCode&quot;: &quot;A String&quot;, # Required. The BCP-47 language code, such as `&quot;en-US&quot;` or `&quot;sr-Latn&quot;`. For more information, see https://www.unicode.org/reports/tr35/#Unicode_locale_identifier.
&quot;mapping&quot;: [ # The mapping for the `Job.edit_list` atoms with text `EditAtom.inputs`.
{ # The mapping for the `Job.edit_list` atoms with text `EditAtom.inputs`.
&quot;inputs&quot;: [ # List of `Job.inputs` that should be embedded in this atom. Only one input is supported.
{ # Identifies which input file and track should be used.
&quot;key&quot;: &quot;A String&quot;, # Required. The `Input.key` that identifies the input file.
&quot;track&quot;: 42, # Required. The zero-based index of the track in the input file.
},
],
&quot;key&quot;: &quot;A String&quot;, # Required. The `EditAtom.key` that references the atom with text inputs in the `Job.edit_list`.
},
],
},
&quot;videoStream&quot;: { # Video stream resource. # Encoding of a video stream.
&quot;allowOpenGop&quot;: True or False, # Specifies whether an open Group of Pictures (GOP) structure should be allowed or not. The default is `false`.
&quot;aqStrength&quot;: 3.14, # Specify the intensity of the adaptive quantizer (AQ). Must be between 0 and 1, where 0 disables the quantizer and 1 maximizes the quantizer. A higher value equals a lower bitrate but smoother image. The default is 0.
&quot;bFrameCount&quot;: 42, # The number of consecutive B-frames. Must be greater than or equal to zero. Must be less than `VideoStream.gop_frame_count` if set. The default is 0.
&quot;bPyramid&quot;: True or False, # Allow B-pyramid for reference frame selection. This may not be supported on all decoders. The default is `false`.
&quot;bitrateBps&quot;: 42, # Required. The video bitrate in bits per second. The minimum value is 1,000. The maximum value for H264/H265 is 800,000,000. The maximum value for VP9 is 480,000,000.
&quot;codec&quot;: &quot;A String&quot;, # Codec type. The following codecs are supported: * `h264` (default) * `h265` * `vp9`
&quot;crfLevel&quot;: 42, # Target CRF level. Must be between 10 and 36, where 10 is the highest quality and 36 is the most efficient compression. The default is 21.
&quot;enableTwoPass&quot;: True or False, # Use two-pass encoding strategy to achieve better video quality. `VideoStream.rate_control_mode` must be `&quot;vbr&quot;`. The default is `false`.
&quot;entropyCoder&quot;: &quot;A String&quot;, # The entropy coder to use. The default is `&quot;cabac&quot;`. Supported entropy coders: - &#x27;cavlc&#x27; - &#x27;cabac&#x27;
&quot;frameRate&quot;: 3.14, # Required. The target video frame rate in frames per second (FPS). Must be less than or equal to 120. If the target frame rate is larger than the input frame rate, the input frame rate is used instead. The API will generate an output FPS that is divisible by the input FPS and less than or equal to the target FPS. See [Calculate frame rate](https://cloud.google.com/transcoder/docs/concepts/frame-rate) for more information.
&quot;gopDuration&quot;: &quot;A String&quot;, # Select the GOP size based on the specified duration. The default is `&quot;3s&quot;`. Note that `gopDuration` must be less than or equal to [`segmentDuration`](#SegmentSettings), and [`segmentDuration`](#SegmentSettings) must be divisible by `gopDuration`.
&quot;gopFrameCount&quot;: 42, # Select the GOP size based on the specified frame count. Must be greater than zero.
&quot;heightPixels&quot;: 42, # The height of the video in pixels. Must be an even integer. When not specified, the height is adjusted to match the specified width and input aspect ratio. If both are omitted, the input height is used.
&quot;pixelFormat&quot;: &quot;A String&quot;, # Pixel format to use. The default is `&quot;yuv420p&quot;`. Supported pixel formats: - &#x27;yuv420p&#x27; pixel format. - &#x27;yuv422p&#x27; pixel format. - &#x27;yuv444p&#x27; pixel format. - &#x27;yuv420p10&#x27; 10-bit HDR pixel format. - &#x27;yuv422p10&#x27; 10-bit HDR pixel format. - &#x27;yuv444p10&#x27; 10-bit HDR pixel format. - &#x27;yuv420p12&#x27; 12-bit HDR pixel format. - &#x27;yuv422p12&#x27; 12-bit HDR pixel format. - &#x27;yuv444p12&#x27; 12-bit HDR pixel format.
&quot;preset&quot;: &quot;A String&quot;, # Enforces the specified codec preset. The default is `veryfast`. The available options are FFmpeg-compatible. Note that certain values for this field may cause the transcoder to override other fields you set in the `VideoStream` message.
&quot;profile&quot;: &quot;A String&quot;, # Enforces the specified codec profile. The following profiles are supported: * `baseline` * `main` * `high` (default) The available options are FFmpeg-compatible. Note that certain values for this field may cause the transcoder to override other fields you set in the `VideoStream` message.
&quot;rateControlMode&quot;: &quot;A String&quot;, # Specify the `rate_control_mode`. The default is `&quot;vbr&quot;`. Supported rate control modes: - &#x27;vbr&#x27; - variable bitrate - &#x27;crf&#x27; - constant rate factor
&quot;tune&quot;: &quot;A String&quot;, # Enforces the specified codec tune. The available options are FFmpeg-compatible. Note that certain values for this field may cause the transcoder to override other fields you set in the `VideoStream` message.
&quot;vbvFullnessBits&quot;: 42, # Initial fullness of the Video Buffering Verifier (VBV) buffer in bits. Must be greater than zero. The default is equal to 90% of `VideoStream.vbv_size_bits`.
&quot;vbvSizeBits&quot;: 42, # Size of the Video Buffering Verifier (VBV) buffer in bits. Must be greater than zero. The default is equal to `VideoStream.bitrate_bps`.
&quot;widthPixels&quot;: 42, # The width of the video in pixels. Must be an even integer. When not specified, the width is adjusted to match the specified height and input aspect ratio. If both are omitted, the input width is used.
},
},
],
&quot;inputs&quot;: [ # List of input assets stored in Cloud Storage.
{ # Input asset.
&quot;key&quot;: &quot;A String&quot;, # A unique key for this input. Must be specified when using advanced mapping and edit lists.
&quot;preprocessingConfig&quot;: { # Preprocessing configurations. # Preprocessing configurations.
&quot;audio&quot;: { # Audio preprocessing configuration. # Audio preprocessing configuration.
&quot;highBoost&quot;: True or False, # Enable boosting high frequency components. The default is `false`.
&quot;lowBoost&quot;: True or False, # Enable boosting low frequency components. The default is `false`.
&quot;lufs&quot;: 3.14, # Specify audio loudness normalization in loudness units relative to full scale (LUFS). Enter a value between -24 and 0 (the default), where: * -24 is the Advanced Television Systems Committee (ATSC A/85) standard * -23 is the EU R128 broadcast standard * -19 is the prior standard for online mono audio * -18 is the ReplayGain standard * -16 is the prior standard for stereo audio * -14 is the new online audio standard recommended by Spotify, as well as Amazon Echo * 0 disables normalization
},
&quot;color&quot;: { # Color preprocessing configuration. # Color preprocessing configuration.
&quot;brightness&quot;: 3.14, # Control brightness of the video. Enter a value between -1 and 1, where -1 is minimum brightness and 1 is maximum brightness. 0 is no change. The default is 0.
&quot;contrast&quot;: 3.14, # Control black and white contrast of the video. Enter a value between -1 and 1, where -1 is minimum contrast and 1 is maximum contrast. 0 is no change. The default is 0.
&quot;saturation&quot;: 3.14, # Control color saturation of the video. Enter a value between -1 and 1, where -1 is fully desaturated and 1 is maximum saturation. 0 is no change. The default is 0.
},
&quot;crop&quot;: { # Video cropping configuration for the input video. The cropped input video is scaled to match the output resolution. # Specify the video cropping configuration.
&quot;bottomPixels&quot;: 42, # The number of pixels to crop from the bottom. The default is 0.
&quot;leftPixels&quot;: 42, # The number of pixels to crop from the left. The default is 0.
&quot;rightPixels&quot;: 42, # The number of pixels to crop from the right. The default is 0.
&quot;topPixels&quot;: 42, # The number of pixels to crop from the top. The default is 0.
},
&quot;deblock&quot;: { # Deblock preprocessing configuration. # Deblock preprocessing configuration.
&quot;enabled&quot;: True or False, # Enable deblocker. The default is `false`.
&quot;strength&quot;: 3.14, # Set strength of the deblocker. Enter a value between 0 and 1. The higher the value, the stronger the block removal. 0 is no deblocking. The default is 0.
},
&quot;denoise&quot;: { # Denoise preprocessing configuration. # Denoise preprocessing configuration.
&quot;strength&quot;: 3.14, # Set strength of the denoise. Enter a value between 0 and 1. The higher the value, the smoother the image. 0 is no denoising. The default is 0.
&quot;tune&quot;: &quot;A String&quot;, # Set the denoiser mode. The default is `&quot;standard&quot;`. Supported denoiser modes: - &#x27;standard&#x27; - &#x27;grain&#x27;
},
&quot;pad&quot;: { # Pad filter configuration for the input video. The padded input video is scaled after padding with black to match the output resolution. # Specify the video pad filter configuration.
&quot;bottomPixels&quot;: 42, # The number of pixels to add to the bottom. The default is 0.
&quot;leftPixels&quot;: 42, # The number of pixels to add to the left. The default is 0.
&quot;rightPixels&quot;: 42, # The number of pixels to add to the right. The default is 0.
&quot;topPixels&quot;: 42, # The number of pixels to add to the top. The default is 0.
},
},
&quot;uri&quot;: &quot;A String&quot;, # URI of the media. Input files must be at least 5 seconds in duration and stored in Cloud Storage (for example, `gs://bucket/inputs/file.mp4`). If empty, the value will be populated from `Job.input_uri`.
},
],
&quot;manifests&quot;: [ # List of output manifests.
{ # Manifest configuration.
&quot;fileName&quot;: &quot;A String&quot;, # The name of the generated file. The default is `&quot;manifest&quot;` with the extension suffix corresponding to the `Manifest.type`.
&quot;muxStreams&quot;: [ # Required. List of user-given `MuxStream.key`s that should appear in this manifest. When `Manifest.type` is `HLS`, a media manifest with the name `MuxStream.key` and the `.m3u8` extension is generated for each element of `Manifest.mux_streams`.
&quot;A String&quot;,
],
&quot;type&quot;: &quot;A String&quot;, # Required. Type of the manifest. Can be &quot;HLS&quot; or &quot;DASH&quot;.
},
],
&quot;muxStreams&quot;: [ # List of multiplexing settings for output streams.
{ # Multiplexing settings for output stream.
&quot;container&quot;: &quot;A String&quot;, # The container format. The default is `&quot;mp4&quot;`. Supported container formats: - &#x27;ts&#x27; - &#x27;fmp4&#x27; - the corresponding file extension is `&quot;.m4s&quot;` - &#x27;mp4&#x27; - &#x27;vtt&#x27;
&quot;elementaryStreams&quot;: [ # List of `ElementaryStream.key`s multiplexed in this stream.
&quot;A String&quot;,
],
&quot;encryption&quot;: { # Encryption settings. # Encryption settings.
&quot;aes128&quot;: { # Configuration for AES-128 encryption. # Configuration for AES-128 encryption.
&quot;keyUri&quot;: &quot;A String&quot;, # Required. URI of the key delivery service. This URI is inserted into the M3U8 header.
},
&quot;iv&quot;: &quot;A String&quot;, # Required. 128-bit Initialization Vector (IV) represented as lowercase hexadecimal digits.
&quot;key&quot;: &quot;A String&quot;, # Required. 128-bit encryption key represented as lowercase hexadecimal digits.
&quot;mpegCenc&quot;: { # Configuration for MPEG Common Encryption (MPEG-CENC). # Configuration for MPEG Common Encryption (MPEG-CENC).
&quot;keyId&quot;: &quot;A String&quot;, # Required. 128-bit Key ID represented as lowercase hexadecimal digits for use with common encryption.
&quot;scheme&quot;: &quot;A String&quot;, # Required. Specify the encryption scheme. Supported encryption schemes: - &#x27;cenc&#x27; - &#x27;cbcs&#x27;
},
&quot;sampleAes&quot;: { # Configuration for SAMPLE-AES encryption. # Configuration for SAMPLE-AES encryption.
&quot;keyUri&quot;: &quot;A String&quot;, # Required. URI of the key delivery service. This URI is inserted into the M3U8 header.
},
},
&quot;fileName&quot;: &quot;A String&quot;, # The name of the generated file. The default is `MuxStream.key` with the extension suffix corresponding to the `MuxStream.container`. Individual segments also have an incremental 10-digit zero-padded suffix starting from 0 before the extension, such as `&quot;mux_stream0000000123.ts&quot;`.
&quot;key&quot;: &quot;A String&quot;, # A unique key for this multiplexed stream. HLS media manifests will be named `MuxStream.key` with the `&quot;.m3u8&quot;` extension suffix.
&quot;segmentSettings&quot;: { # Segment settings for `&quot;ts&quot;`, `&quot;fmp4&quot;` and `&quot;vtt&quot;`. # Segment settings for `&quot;ts&quot;`, `&quot;fmp4&quot;` and `&quot;vtt&quot;`.
&quot;individualSegments&quot;: True or False, # Required. Create an individual segment file. The default is `false`.
&quot;segmentDuration&quot;: &quot;A String&quot;, # Duration of the segments in seconds. The default is `&quot;6.0s&quot;`. Note that `segmentDuration` must be greater than or equal to [`gopDuration`](#videostream), and `segmentDuration` must be divisible by [`gopDuration`](#videostream).
},
},
],
&quot;output&quot;: { # Location of output file(s) in a Cloud Storage bucket. # Output configuration.
&quot;uri&quot;: &quot;A String&quot;, # URI for the output file(s). For example, `gs://my-bucket/outputs/`. If empty, the value is populated from `Job.output_uri`.
},
&quot;overlays&quot;: [ # List of overlays on the output video, in descending Z-order.
{ # Overlay configuration.
&quot;animations&quot;: [ # List of Animations. The list should be chronological, without any time overlap.
{ # Animation types.
&quot;animationEnd&quot;: { # End the previous overlay animation from the video. Without AnimationEnd, the overlay object will keep the state of the previous animation until the end of the video. # End the previous animation.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # The time to end the overlay object, in seconds. Default: 0
},
&quot;animationFade&quot;: { # Display overlay object with fade animation. # Display overlay object with fade animation.
&quot;endTimeOffset&quot;: &quot;A String&quot;, # The time to end the fade animation, in seconds. Default: `start_time_offset` + 1s
&quot;fadeType&quot;: &quot;A String&quot;, # Required. Type of fade animation: `FADE_IN` or `FADE_OUT`.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # The time to start the fade animation, in seconds. Default: 0
&quot;xy&quot;: { # 2D normalized coordinates. Default: `{0.0, 0.0}` # Normalized coordinates based on output video resolution. Valid values: `0.0`–`1.0`. `xy` is the upper-left coordinate of the overlay object. For example, use the x and y coordinates {0,0} to position the top-left corner of the overlay animation in the top-left corner of the output video.
&quot;x&quot;: 3.14, # Normalized x coordinate.
&quot;y&quot;: 3.14, # Normalized y coordinate.
},
},
&quot;animationStatic&quot;: { # Display static overlay object. # Display static overlay object.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # The time to start displaying the overlay object, in seconds. Default: 0
&quot;xy&quot;: { # 2D normalized coordinates. Default: `{0.0, 0.0}` # Normalized coordinates based on output video resolution. Valid values: `0.0`–`1.0`. `xy` is the upper-left coordinate of the overlay object. For example, use the x and y coordinates {0,0} to position the top-left corner of the overlay animation in the top-left corner of the output video.
&quot;x&quot;: 3.14, # Normalized x coordinate.
&quot;y&quot;: 3.14, # Normalized y coordinate.
},
},
},
],
&quot;image&quot;: { # Overlaid jpeg image. # Image overlay.
&quot;alpha&quot;: 3.14, # Target image opacity. Valid values are from `1.0` (solid, default) to `0.0` (transparent), exclusive. Set this to a value greater than `0.0`.
&quot;resolution&quot;: { # 2D normalized coordinates. Default: `{0.0, 0.0}` # Normalized image resolution, based on output video resolution. Valid values: `0.0`–`1.0`. To respect the original image aspect ratio, set either `x` or `y` to `0.0`. To use the original image resolution, set both `x` and `y` to `0.0`.
&quot;x&quot;: 3.14, # Normalized x coordinate.
&quot;y&quot;: 3.14, # Normalized y coordinate.
},
&quot;uri&quot;: &quot;A String&quot;, # Required. URI of the JPEG image in Cloud Storage. For example, `gs://bucket/inputs/image.jpeg`. JPEG is the only supported image type.
},
},
],
&quot;pubsubDestination&quot;: { # A Pub/Sub destination. # Destination on Pub/Sub.
&quot;topic&quot;: &quot;A String&quot;, # The name of the Pub/Sub topic to publish job completion notification to. For example: `projects/{project}/topics/{topic}`.
},
&quot;spriteSheets&quot;: [ # List of output sprite sheets.
{ # Sprite sheet configuration.
&quot;columnCount&quot;: 42, # The maximum number of sprites per row in a sprite sheet. The default is 0, which indicates no maximum limit.
&quot;endTimeOffset&quot;: &quot;A String&quot;, # End time in seconds, relative to the output file timeline. When `end_time_offset` is not specified, the sprites are generated until the end of the output file.
&quot;filePrefix&quot;: &quot;A String&quot;, # Required. File name prefix for the generated sprite sheets. Each sprite sheet has an incremental 10-digit zero-padded suffix starting from 0 before the extension, such as `&quot;sprite_sheet0000000123.jpeg&quot;`.
&quot;format&quot;: &quot;A String&quot;, # Format type. The default is `&quot;jpeg&quot;`. Supported formats: - &#x27;jpeg&#x27;
&quot;interval&quot;: &quot;A String&quot;, # Starting from `0s`, create sprites at regular intervals. Specify the interval value in seconds.
&quot;quality&quot;: 42, # The quality of the generated sprite sheet. Enter a value between 1 and 100, where 1 is the lowest quality and 100 is the highest quality. The default is 100. A high quality value corresponds to a low image data compression ratio.
&quot;rowCount&quot;: 42, # The maximum number of rows per sprite sheet. When the sprite sheet is full, a new sprite sheet is created. The default is 0, which indicates no maximum limit.
&quot;spriteHeightPixels&quot;: 42, # Required. The height of sprite in pixels. Must be an even integer. To preserve the source aspect ratio, set the SpriteSheet.sprite_height_pixels field or the SpriteSheet.sprite_width_pixels field, but not both (the API will automatically calculate the missing field).
&quot;spriteWidthPixels&quot;: 42, # Required. The width of sprite in pixels. Must be an even integer. To preserve the source aspect ratio, set the SpriteSheet.sprite_width_pixels field or the SpriteSheet.sprite_height_pixels field, but not both (the API will automatically calculate the missing field).
&quot;startTimeOffset&quot;: &quot;A String&quot;, # Start time in seconds, relative to the output file timeline. Determines the first sprite to pick. The default is `0s`.
&quot;totalCount&quot;: 42, # Total number of sprites. Create the specified number of sprites distributed evenly across the timeline of the output media. The default is 100.
},
],
},
&quot;name&quot;: &quot;A String&quot;, # The resource name of the job template. Format: `projects/{project}/locations/{location}/jobTemplates/{job_template}`
}</pre>
</div>
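<p>The schema above is easier to read next to a concrete value. The following is a minimal, illustrative sketch (not part of the generated reference) of a job template resource assembled as a Python dictionary using a few of the fields documented above; the stream keys, numeric values, and bucket path are placeholder assumptions, not recommended settings.</p>
<pre>
# Illustrative sketch only: a job template resource dict matching the schema
# documented above. All keys below are documented fields; the values are placeholders.
job_template_body = {
    'config': {
        'elementaryStreams': [
            {
                'key': 'video-stream0',
                'videoStream': {
                    'codec': 'h264',        # default codec
                    'bitrateBps': 2500000,  # required; minimum 1,000
                    'frameRate': 30,        # required; at most 120
                    'widthPixels': 1280,
                    'heightPixels': 720,
                },
            },
            {
                'key': 'audio-stream0',
                'audioStream': {'codec': 'aac', 'bitrateBps': 64000},
            },
        ],
        'muxStreams': [
            {
                'key': 'hd',
                'container': 'mp4',
                'elementaryStreams': ['video-stream0', 'audio-stream0'],
            },
        ],
        'output': {'uri': 'gs://my-bucket/outputs/'},  # placeholder bucket
    },
}
</pre>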
<div class="method">
<code class="details" id="list">list(parent, pageSize=None, pageToken=None, x__xgafv=None)</code>
<pre>Lists job templates in the specified region.
Args:
parent: string, Required. The parent location from which to retrieve the collection of job templates. Format: `projects/{project}/locations/{location}` (required)
pageSize: integer, The maximum number of items to return.
pageToken: string, The `next_page_token` value returned from a previous List request, if any.
x__xgafv: string, V1 error format.
Allowed values
1 - v1 error format
2 - v2 error format
Returns:
An object of the form:
{ # Response message for `TranscoderService.ListJobTemplates`.
&quot;jobTemplates&quot;: [ # List of job templates in the specified region.
{ # Transcoding job template resource.
&quot;config&quot;: { # Job configuration # The configuration for this template.
&quot;adBreaks&quot;: [ # List of ad breaks. Specifies where to insert ad break tags in the output manifests.
{ # Ad break.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # Start time in seconds for the ad break, relative to the output file timeline. The default is `0s`.
},
],
&quot;editList&quot;: [ # List of `Edit atom`s. Defines the ultimate timeline of the resulting file or manifest.
{ # Edit atom.
&quot;endTimeOffset&quot;: &quot;A String&quot;, # End time in seconds for the atom, relative to the input file timeline. When `end_time_offset` is not specified, the `inputs` are used until the end of the atom.
&quot;inputs&quot;: [ # List of `Input.key`s identifying files that should be used in this atom. The listed `inputs` must have the same timeline.
&quot;A String&quot;,
],
&quot;key&quot;: &quot;A String&quot;, # A unique key for this atom. Must be specified when using advanced mapping.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # Start time in seconds for the atom, relative to the input file timeline. The default is `0s`.
},
],
&quot;elementaryStreams&quot;: [ # List of elementary streams.
{ # Encoding of an input file such as an audio, video, or text track. Elementary streams must be packaged before mapping and sharing between different output formats.
&quot;audioStream&quot;: { # Audio stream resource. # Encoding of an audio stream.
&quot;bitrateBps&quot;: 42, # Required. Audio bitrate in bits per second. Must be between 1 and 10,000,000.
&quot;channelCount&quot;: 42, # Number of audio channels. Must be between 1 and 6. The default is 2.
&quot;channelLayout&quot;: [ # A list of channel names specifying the layout of the audio channels. This only affects the metadata embedded in the container headers, if supported by the specified format. The default is `[&quot;fl&quot;, &quot;fr&quot;]`. Supported channel names: - &#x27;fl&#x27; - Front left channel - &#x27;fr&#x27; - Front right channel - &#x27;sl&#x27; - Side left channel - &#x27;sr&#x27; - Side right channel - &#x27;fc&#x27; - Front center channel - &#x27;lfe&#x27; - Low frequency effects channel
&quot;A String&quot;,
],
&quot;codec&quot;: &quot;A String&quot;, # The codec for this audio stream. The default is `&quot;aac&quot;`. Supported audio codecs: - &#x27;aac&#x27; - &#x27;aac-he&#x27; - &#x27;aac-he-v2&#x27; - &#x27;mp3&#x27; - &#x27;ac3&#x27; - &#x27;eac3&#x27;
&quot;mapping&quot;: [ # The mapping for the `Job.edit_list` atoms with audio `EditAtom.inputs`.
{ # The mapping for the `Job.edit_list` atoms with audio `EditAtom.inputs`.
&quot;channels&quot;: [ # List of `Channel`s for this audio stream.
{ # The audio channel.
&quot;inputs&quot;: [ # List of `Job.inputs` for this audio channel.
{ # Identifies which input file, track, and channel should be used.
&quot;channel&quot;: 42, # Required. The zero-based index of the channel in the input file.
&quot;gainDb&quot;: 3.14, # Audio volume control in dB. Negative values decrease volume, positive values increase. The default is 0.
&quot;key&quot;: &quot;A String&quot;, # Required. The `Input.key` that identifies the input file.
&quot;track&quot;: 42, # Required. The zero-based index of the track in the input file.
},
],
},
],
&quot;key&quot;: &quot;A String&quot;, # Required. The `EditAtom.key` that references the atom with audio inputs in the `Job.edit_list`.
},
],
&quot;sampleRateHertz&quot;: 42, # The audio sample rate in Hertz. The default is 48000 Hertz.
},
&quot;key&quot;: &quot;A String&quot;, # A unique key for this elementary stream.
&quot;textStream&quot;: { # Encoding of a text stream. For example, closed captions or subtitles. # Encoding of a text stream. For example, closed captions or subtitles.
&quot;codec&quot;: &quot;A String&quot;, # The codec for this text stream. The default is `&quot;webvtt&quot;`. Supported text codecs: - &#x27;srt&#x27; - &#x27;ttml&#x27; - &#x27;cea608&#x27; - &#x27;cea708&#x27; - &#x27;webvtt&#x27;
&quot;languageCode&quot;: &quot;A String&quot;, # Required. The BCP-47 language code, such as `&quot;en-US&quot;` or `&quot;sr-Latn&quot;`. For more information, see https://www.unicode.org/reports/tr35/#Unicode_locale_identifier.
&quot;mapping&quot;: [ # The mapping for the `Job.edit_list` atoms with text `EditAtom.inputs`.
{ # The mapping for the `Job.edit_list` atoms with text `EditAtom.inputs`.
&quot;inputs&quot;: [ # List of `Job.inputs` that should be embedded in this atom. Only one input is supported.
{ # Identifies which input file and track should be used.
&quot;key&quot;: &quot;A String&quot;, # Required. The `Input.key` that identifies the input file.
&quot;track&quot;: 42, # Required. The zero-based index of the track in the input file.
},
],
&quot;key&quot;: &quot;A String&quot;, # Required. The `EditAtom.key` that references the atom with text inputs in the `Job.edit_list`.
},
],
},
&quot;videoStream&quot;: { # Video stream resource. # Encoding of a video stream.
&quot;allowOpenGop&quot;: True or False, # Specifies whether an open Group of Pictures (GOP) structure is allowed. The default is `false`.
&quot;aqStrength&quot;: 3.14, # Specify the intensity of the adaptive quantizer (AQ). Must be between 0 and 1, where 0 disables the quantizer and 1 maximizes the quantizer. A higher value results in a lower bitrate but a smoother image. The default is 0.
&quot;bFrameCount&quot;: 42, # The number of consecutive B-frames. Must be greater than or equal to zero. Must be less than `VideoStream.gop_frame_count` if set. The default is 0.
&quot;bPyramid&quot;: True or False, # Allow B-pyramid for reference frame selection. This may not be supported on all decoders. The default is `false`.
&quot;bitrateBps&quot;: 42, # Required. The video bitrate in bits per second. The minimum value is 1,000. The maximum value for H264/H265 is 800,000,000. The maximum value for VP9 is 480,000,000.
&quot;codec&quot;: &quot;A String&quot;, # Codec type. The following codecs are supported: * `h264` (default) * `h265` * `vp9`
&quot;crfLevel&quot;: 42, # Target CRF level. Must be between 10 and 36, where 10 is the highest quality and 36 is the most efficient compression. The default is 21.
&quot;enableTwoPass&quot;: True or False, # Use two-pass encoding strategy to achieve better video quality. `VideoStream.rate_control_mode` must be `&quot;vbr&quot;`. The default is `false`.
&quot;entropyCoder&quot;: &quot;A String&quot;, # The entropy coder to use. The default is `&quot;cabac&quot;`. Supported entropy coders: - &#x27;cavlc&#x27; - &#x27;cabac&#x27;
&quot;frameRate&quot;: 3.14, # Required. The target video frame rate in frames per second (FPS). Must be less than or equal to 120. If the target frame rate is larger than the input frame rate, the input frame rate is used instead. The API will generate an output FPS that is divisible by the input FPS and less than or equal to the target FPS. See [Calculate frame rate](https://cloud.google.com/transcoder/docs/concepts/frame-rate) for more information.
&quot;gopDuration&quot;: &quot;A String&quot;, # Select the GOP size based on the specified duration. The default is `&quot;3s&quot;`. Note that `gopDuration` must be less than or equal to [`segmentDuration`](#SegmentSettings), and [`segmentDuration`](#SegmentSettings) must be divisible by `gopDuration`.
&quot;gopFrameCount&quot;: 42, # Select the GOP size based on the specified frame count. Must be greater than zero.
&quot;heightPixels&quot;: 42, # The height of the video in pixels. Must be an even integer. When not specified, the height is adjusted to match the specified width and input aspect ratio. If both are omitted, the input height is used.
&quot;pixelFormat&quot;: &quot;A String&quot;, # Pixel format to use. The default is `&quot;yuv420p&quot;`. Supported pixel formats: - &#x27;yuv420p&#x27; pixel format. - &#x27;yuv422p&#x27; pixel format. - &#x27;yuv444p&#x27; pixel format. - &#x27;yuv420p10&#x27; 10-bit HDR pixel format. - &#x27;yuv422p10&#x27; 10-bit HDR pixel format. - &#x27;yuv444p10&#x27; 10-bit HDR pixel format. - &#x27;yuv420p12&#x27; 12-bit HDR pixel format. - &#x27;yuv422p12&#x27; 12-bit HDR pixel format. - &#x27;yuv444p12&#x27; 12-bit HDR pixel format.
&quot;preset&quot;: &quot;A String&quot;, # Enforces the specified codec preset. The default is `veryfast`. The available options are FFmpeg-compatible. Note that certain values for this field may cause the transcoder to override other fields you set in the `VideoStream` message.
&quot;profile&quot;: &quot;A String&quot;, # Enforces the specified codec profile. The following profiles are supported: * `baseline` * `main` * `high` (default) The available options are FFmpeg-compatible. Note that certain values for this field may cause the transcoder to override other fields you set in the `VideoStream` message.
&quot;rateControlMode&quot;: &quot;A String&quot;, # Specify the `rate_control_mode`. The default is `&quot;vbr&quot;`. Supported rate control modes: - &#x27;vbr&#x27; - variable bitrate - &#x27;crf&#x27; - constant rate factor
&quot;tune&quot;: &quot;A String&quot;, # Enforces the specified codec tune. The available options are FFmpeg-compatible. Note that certain values for this field may cause the transcoder to override other fields you set in the `VideoStream` message.
&quot;vbvFullnessBits&quot;: 42, # Initial fullness of the Video Buffering Verifier (VBV) buffer in bits. Must be greater than zero. The default is equal to 90% of `VideoStream.vbv_size_bits`.
&quot;vbvSizeBits&quot;: 42, # Size of the Video Buffering Verifier (VBV) buffer in bits. Must be greater than zero. The default is equal to `VideoStream.bitrate_bps`.
&quot;widthPixels&quot;: 42, # The width of the video in pixels. Must be an even integer. When not specified, the width is adjusted to match the specified height and input aspect ratio. If both are omitted, the input width is used.
},
},
],
&quot;inputs&quot;: [ # List of input assets stored in Cloud Storage.
{ # Input asset.
&quot;key&quot;: &quot;A String&quot;, # A unique key for this input. Must be specified when using advanced mapping and edit lists.
&quot;preprocessingConfig&quot;: { # Preprocessing configurations. # Preprocessing configurations.
&quot;audio&quot;: { # Audio preprocessing configuration. # Audio preprocessing configuration.
&quot;highBoost&quot;: True or False, # Enable boosting high frequency components. The default is `false`.
&quot;lowBoost&quot;: True or False, # Enable boosting low frequency components. The default is `false`.
&quot;lufs&quot;: 3.14, # Specify audio loudness normalization in loudness units relative to full scale (LUFS). Enter a value between -24 and 0 (the default), where: * -24 is the Advanced Television Systems Committee (ATSC A/85) standard * -23 is the EU R128 broadcast standard * -19 is the prior standard for online mono audio * -18 is the ReplayGain standard * -16 is the prior standard for stereo audio * -14 is the new online audio standard recommended by Spotify, as well as Amazon Echo * 0 disables normalization
},
&quot;color&quot;: { # Color preprocessing configuration. # Color preprocessing configuration.
&quot;brightness&quot;: 3.14, # Control brightness of the video. Enter a value between -1 and 1, where -1 is minimum brightness and 1 is maximum brightness. 0 is no change. The default is 0.
&quot;contrast&quot;: 3.14, # Control black and white contrast of the video. Enter a value between -1 and 1, where -1 is minimum contrast and 1 is maximum contrast. 0 is no change. The default is 0.
&quot;saturation&quot;: 3.14, # Control color saturation of the video. Enter a value between -1 and 1, where -1 is fully desaturated and 1 is maximum saturation. 0 is no change. The default is 0.
},
&quot;crop&quot;: { # Video cropping configuration for the input video. The cropped input video is scaled to match the output resolution. # Specify the video cropping configuration.
&quot;bottomPixels&quot;: 42, # The number of pixels to crop from the bottom. The default is 0.
&quot;leftPixels&quot;: 42, # The number of pixels to crop from the left. The default is 0.
&quot;rightPixels&quot;: 42, # The number of pixels to crop from the right. The default is 0.
&quot;topPixels&quot;: 42, # The number of pixels to crop from the top. The default is 0.
},
&quot;deblock&quot;: { # Deblock preprocessing configuration. # Deblock preprocessing configuration.
&quot;enabled&quot;: True or False, # Enable deblocker. The default is `false`.
&quot;strength&quot;: 3.14, # Set strength of the deblocker. Enter a value between 0 and 1. The higher the value, the stronger the block removal. 0 is no deblocking. The default is 0.
},
&quot;denoise&quot;: { # Denoise preprocessing configuration. # Denoise preprocessing configuration.
&quot;strength&quot;: 3.14, # Set strength of the denoise. Enter a value between 0 and 1. The higher the value, the smoother the image. 0 is no denoising. The default is 0.
&quot;tune&quot;: &quot;A String&quot;, # Set the denoiser mode. The default is `&quot;standard&quot;`. Supported denoiser modes: - &#x27;standard&#x27; - &#x27;grain&#x27;
},
&quot;pad&quot;: { # Pad filter configuration for the input video. The padded input video is scaled after padding with black to match the output resolution. # Specify the video pad filter configuration.
&quot;bottomPixels&quot;: 42, # The number of pixels to add to the bottom. The default is 0.
&quot;leftPixels&quot;: 42, # The number of pixels to add to the left. The default is 0.
&quot;rightPixels&quot;: 42, # The number of pixels to add to the right. The default is 0.
&quot;topPixels&quot;: 42, # The number of pixels to add to the top. The default is 0.
},
},
&quot;uri&quot;: &quot;A String&quot;, # URI of the media. Input files must be at least 5 seconds in duration and stored in Cloud Storage (for example, `gs://bucket/inputs/file.mp4`). If empty, the value will be populated from `Job.input_uri`.
},
],
&quot;manifests&quot;: [ # List of output manifests.
{ # Manifest configuration.
&quot;fileName&quot;: &quot;A String&quot;, # The name of the generated file. The default is `&quot;manifest&quot;` with the extension suffix corresponding to the `Manifest.type`.
&quot;muxStreams&quot;: [ # Required. List of user-given `MuxStream.key`s that should appear in this manifest. When `Manifest.type` is `HLS`, a media manifest with the name `MuxStream.key` and the `.m3u8` extension is generated for each element of `Manifest.mux_streams`.
&quot;A String&quot;,
],
&quot;type&quot;: &quot;A String&quot;, # Required. Type of the manifest. Can be &quot;HLS&quot; or &quot;DASH&quot;.
},
],
&quot;muxStreams&quot;: [ # List of multiplexing settings for output streams.
{ # Multiplexing settings for output stream.
&quot;container&quot;: &quot;A String&quot;, # The container format. The default is `&quot;mp4&quot;`. Supported container formats: - &#x27;ts&#x27; - &#x27;fmp4&#x27; - the corresponding file extension is `&quot;.m4s&quot;` - &#x27;mp4&#x27; - &#x27;vtt&#x27;
&quot;elementaryStreams&quot;: [ # List of `ElementaryStream.key`s multiplexed in this stream.
&quot;A String&quot;,
],
&quot;encryption&quot;: { # Encryption settings. # Encryption settings.
&quot;aes128&quot;: { # Configuration for AES-128 encryption. # Configuration for AES-128 encryption.
&quot;keyUri&quot;: &quot;A String&quot;, # Required. URI of the key delivery service. This URI is inserted into the M3U8 header.
},
&quot;iv&quot;: &quot;A String&quot;, # Required. 128-bit Initialization Vector (IV) represented as lowercase hexadecimal digits.
&quot;key&quot;: &quot;A String&quot;, # Required. 128-bit encryption key represented as lowercase hexadecimal digits.
&quot;mpegCenc&quot;: { # Configuration for MPEG Common Encryption (MPEG-CENC). # Configuration for MPEG Common Encryption (MPEG-CENC).
&quot;keyId&quot;: &quot;A String&quot;, # Required. 128-bit Key ID represented as lowercase hexadecimal digits for use with common encryption.
&quot;scheme&quot;: &quot;A String&quot;, # Required. Specify the encryption scheme. Supported encryption schemes: - &#x27;cenc&#x27; - &#x27;cbcs&#x27;
},
&quot;sampleAes&quot;: { # Configuration for SAMPLE-AES encryption. # Configuration for SAMPLE-AES encryption.
&quot;keyUri&quot;: &quot;A String&quot;, # Required. URI of the key delivery service. This URI is inserted into the M3U8 header.
},
},
&quot;fileName&quot;: &quot;A String&quot;, # The name of the generated file. The default is `MuxStream.key` with the extension suffix corresponding to the `MuxStream.container`. Individual segments also have an incremental 10-digit zero-padded suffix starting from 0 before the extension, such as `&quot;mux_stream0000000123.ts&quot;`.
&quot;key&quot;: &quot;A String&quot;, # A unique key for this multiplexed stream. HLS media manifests will be named `MuxStream.key` with the `&quot;.m3u8&quot;` extension suffix.
&quot;segmentSettings&quot;: { # Segment settings for `&quot;ts&quot;`, `&quot;fmp4&quot;` and `&quot;vtt&quot;`. # Segment settings for `&quot;ts&quot;`, `&quot;fmp4&quot;` and `&quot;vtt&quot;`.
&quot;individualSegments&quot;: True or False, # Required. Create an individual segment file. The default is `false`.
&quot;segmentDuration&quot;: &quot;A String&quot;, # Duration of the segments in seconds. The default is `&quot;6.0s&quot;`. Note that `segmentDuration` must be greater than or equal to [`gopDuration`](#videostream), and `segmentDuration` must be divisible by [`gopDuration`](#videostream).
},
},
],
&quot;output&quot;: { # Location of output file(s) in a Cloud Storage bucket. # Output configuration.
&quot;uri&quot;: &quot;A String&quot;, # URI for the output file(s). For example, `gs://my-bucket/outputs/`. If empty, the value is populated from `Job.output_uri`.
},
&quot;overlays&quot;: [ # List of overlays on the output video, in descending Z-order.
{ # Overlay configuration.
&quot;animations&quot;: [ # List of Animations. The list should be chronological, without any time overlap.
{ # Animation types.
&quot;animationEnd&quot;: { # End the previous overlay animation from the video. Without AnimationEnd, the overlay object will keep the state of the previous animation until the end of the video. # End the previous animation.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # The time to end the overlay object, in seconds. Default: 0
},
&quot;animationFade&quot;: { # Display overlay object with fade animation. # Display overlay object with fade animation.
&quot;endTimeOffset&quot;: &quot;A String&quot;, # The time to end the fade animation, in seconds. Default: `start_time_offset` + 1s
&quot;fadeType&quot;: &quot;A String&quot;, # Required. Type of fade animation: `FADE_IN` or `FADE_OUT`.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # The time to start the fade animation, in seconds. Default: 0
&quot;xy&quot;: { # 2D normalized coordinates. Default: `{0.0, 0.0}` # Normalized coordinates based on output video resolution. Valid values: `0.0`–`1.0`. `xy` is the upper-left coordinate of the overlay object. For example, use the x and y coordinates {0,0} to position the top-left corner of the overlay animation in the top-left corner of the output video.
&quot;x&quot;: 3.14, # Normalized x coordinate.
&quot;y&quot;: 3.14, # Normalized y coordinate.
},
},
&quot;animationStatic&quot;: { # Display static overlay object. # Display static overlay object.
&quot;startTimeOffset&quot;: &quot;A String&quot;, # The time to start displaying the overlay object, in seconds. Default: 0
&quot;xy&quot;: { # 2D normalized coordinates. Default: `{0.0, 0.0}` # Normalized coordinates based on output video resolution. Valid values: `0.0`–`1.0`. `xy` is the upper-left coordinate of the overlay object. For example, use the x and y coordinates {0,0} to position the top-left corner of the overlay animation in the top-left corner of the output video.
&quot;x&quot;: 3.14, # Normalized x coordinate.
&quot;y&quot;: 3.14, # Normalized y coordinate.
},
},
},
],
&quot;image&quot;: { # Overlaid jpeg image. # Image overlay.
&quot;alpha&quot;: 3.14, # Target image opacity. Valid values are from `1.0` (solid, default) to `0.0` (transparent), exclusive. Set this to a value greater than `0.0`.
&quot;resolution&quot;: { # 2D normalized coordinates. Default: `{0.0, 0.0}` # Normalized image resolution, based on output video resolution. Valid values: `0.0`–`1.0`. To respect the original image aspect ratio, set either `x` or `y` to `0.0`. To use the original image resolution, set both `x` and `y` to `0.0`.
&quot;x&quot;: 3.14, # Normalized x coordinate.
&quot;y&quot;: 3.14, # Normalized y coordinate.
},
&quot;uri&quot;: &quot;A String&quot;, # Required. URI of the JPEG image in Cloud Storage. For example, `gs://bucket/inputs/image.jpeg`. JPEG is the only supported image type.
},
},
],
&quot;pubsubDestination&quot;: { # A Pub/Sub destination. # Destination on Pub/Sub.
&quot;topic&quot;: &quot;A String&quot;, # The name of the Pub/Sub topic to publish job completion notification to. For example: `projects/{project}/topics/{topic}`.
},
&quot;spriteSheets&quot;: [ # List of output sprite sheets.
{ # Sprite sheet configuration.
&quot;columnCount&quot;: 42, # The maximum number of sprites per row in a sprite sheet. The default is 0, which indicates no maximum limit.
&quot;endTimeOffset&quot;: &quot;A String&quot;, # End time in seconds, relative to the output file timeline. When `end_time_offset` is not specified, the sprites are generated until the end of the output file.
&quot;filePrefix&quot;: &quot;A String&quot;, # Required. File name prefix for the generated sprite sheets. Each sprite sheet has an incremental 10-digit zero-padded suffix starting from 0 before the extension, such as `&quot;sprite_sheet0000000123.jpeg&quot;`.
&quot;format&quot;: &quot;A String&quot;, # Format type. The default is `&quot;jpeg&quot;`. Supported formats: - &#x27;jpeg&#x27;
&quot;interval&quot;: &quot;A String&quot;, # Starting from `0s`, create sprites at regular intervals. Specify the interval value in seconds.
&quot;quality&quot;: 42, # The quality of the generated sprite sheet. Enter a value between 1 and 100, where 1 is the lowest quality and 100 is the highest quality. The default is 100. A high quality value corresponds to a low image data compression ratio.
&quot;rowCount&quot;: 42, # The maximum number of rows per sprite sheet. When the sprite sheet is full, a new sprite sheet is created. The default is 0, which indicates no maximum limit.
&quot;spriteHeightPixels&quot;: 42, # Required. The height of sprite in pixels. Must be an even integer. To preserve the source aspect ratio, set the SpriteSheet.sprite_height_pixels field or the SpriteSheet.sprite_width_pixels field, but not both (the API will automatically calculate the missing field).
&quot;spriteWidthPixels&quot;: 42, # Required. The width of sprite in pixels. Must be an even integer. To preserve the source aspect ratio, set the SpriteSheet.sprite_width_pixels field or the SpriteSheet.sprite_height_pixels field, but not both (the API will automatically calculate the missing field).
&quot;startTimeOffset&quot;: &quot;A String&quot;, # Start time in seconds, relative to the output file timeline. Determines the first sprite to pick. The default is `0s`.
&quot;totalCount&quot;: 42, # Total number of sprites. Create the specified number of sprites distributed evenly across the timeline of the output media. The default is 100.
},
],
},
&quot;name&quot;: &quot;A String&quot;, # The resource name of the job template. Format: `projects/{project}/locations/{location}/jobTemplates/{job_template}`
},
],
&quot;nextPageToken&quot;: &quot;A String&quot;, # The pagination token.
}</pre>
</div>
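<p>A short usage sketch (not part of the generated reference) for the <code>list()</code> call documented above. It assumes a service object built with <code>googleapiclient.discovery.build(&#x27;transcoder&#x27;, &#x27;v1beta1&#x27;)</code>; the project and location in the parent path are placeholders.</p>
<pre>
from googleapiclient.discovery import build

# Assumption: the Transcoder API service object created via the discovery client.
transcoder = build('transcoder', 'v1beta1')

parent = 'projects/my-project/locations/us-central1'  # placeholder

response = (
    transcoder.projects()
    .locations()
    .jobTemplates()
    .list(parent=parent, pageSize=25)
    .execute()
)

# `jobTemplates` and `nextPageToken` follow the response schema documented above.
for template in response.get('jobTemplates', []):
    print(template['name'])
</pre>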
<div class="method">
<code class="details" id="list_next">list_next(previous_request, previous_response)</code>
<pre>Retrieves the next page of results.
Args:
previous_request: The request for the previous page. (required)
previous_response: The response from the request for the previous page. (required)
Returns:
A request object that you can call &#x27;execute()&#x27; on to request the next
page. Returns None if there are no more items in the collection.
</pre>
</div>
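<p>A sketch of the pagination pattern implied by <code>list()</code> and <code>list_next()</code> above; the service object, project, and location are placeholder assumptions.</p>
<pre>
from googleapiclient.discovery import build

transcoder = build('transcoder', 'v1beta1')  # assumed service name and version
templates = transcoder.projects().locations().jobTemplates()

request = templates.list(
    parent='projects/my-project/locations/us-central1', pageSize=10
)
while request is not None:
    response = request.execute()
    for template in response.get('jobTemplates', []):
        print(template['name'])
    # list_next() returns None when there are no more pages.
    request = templates.list_next(request, response)
</pre>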
</body></html>