Microsoft MVP 정성태의 닷넷 이야기 (SeongTae Jeong's .NET Story)
Author: 정성태 (techsharer at outlook.com)

C# - Printing media information like ffprobe using the ffmpeg (FFmpeg.AutoGen) AVFormatContext

In ffmpeg, the unit that handles a single media file is AVFormatContext. To obtain one, start by calling avformat_open_input.

string filePath = @"D:\media_sample\output2.mp4";

AVFormatContext* av_context = null;
int ret = ffmpeg.avformat_open_input(&av_context, filePath, null, null);

If you pass a null av_context variable to avformat_open_input, it allocates one and returns it. Alternatively, you can allocate it in advance with the avformat_alloc_context function.

AVFormatContext* av_context = ffmpeg.avformat_alloc_context();
int ret = ffmpeg.avformat_open_input(&av_context, filePath, null, null);
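Either way, the return value is 0 on success and a negative AVERROR code on failure. As a sketch (the GetErrorText helper name is mine, not part of the library), such a code can be turned into a readable message with av_strerror:

```csharp
// A sketch (the GetErrorText name is mine): converts a negative AVERROR
// code, such as the one returned by avformat_open_input, into a readable
// message via av_strerror.
static unsafe string GetErrorText(int error)
{
    const int bufferSize = 1024;
    byte* buffer = stackalloc byte[bufferSize];
    ffmpeg.av_strerror(error, buffer, bufferSize);
    return Marshal.PtrToStringAnsi(new IntPtr(buffer));
}
```

With this, `if (ret < 0) Console.WriteLine(GetErrorText(ret));` prints a message such as "No such file or directory" when the path is wrong.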

Once the file is opened this way, you can query some of the information in AVFormatContext. In a previous post I showed an example that reads the metadata,

C# - An example that shows the metadata of a multimedia file using ffmpeg (FFmpeg.AutoGen) (metadata.c)
; https://www.sysnet.pe.kr/2/0/12936

That code called avformat_open_input + avformat_find_stream_info, remember? In the test above, however, the metadata could be queried even without calling avformat_find_stream_info. (In any case, organizing things this way makes them feel much more approachable. ^^)

Note that obtaining the AVFormatContext via avformat_open_input does not fill in every field of the structure. For example, information such as bit_rate requires an additional call to avformat_find_stream_info.

ffmpeg.avformat_find_stream_info(av_context, null);

Finally, once you are done with the AVFormatContext, you can release its resources with avformat_close_input. Putting this all together, the example code can be organized as follows.

string filePath = @"D:\media_sample\output2.mp4";

AVFormatContext* av_context = null;
int ret = ffmpeg.avformat_open_input(&av_context, filePath, null, null);
if (ret != 0)
{
    return;
}

// ...[query the information available after avformat_open_input]...

ffmpeg.avformat_find_stream_info(av_context, null);

// ...[query the information available after avformat_find_stream_info]...

ffmpeg.avformat_close_input(&av_context);
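As a side note, the duration field is in AV_TIME_BASE units (1,000,000 per second), while ffprobe formats it as hh:mm:ss.ff. A minimal sketch of the same conversion (the sample value below is assumed; in real code it would come from av_context->duration):

```csharp
// AVFormatContext.duration is in AV_TIME_BASE units (microseconds); convert
// it into ffprobe's "Duration: 00:03:07.95" style. 187_954_000 is an assumed
// sample value standing in for av_context->duration.
long durationUs = 187_954_000;
TimeSpan ts = TimeSpan.FromSeconds((double)durationUs / ffmpeg.AV_TIME_BASE);
Console.WriteLine($"Duration: {ts:hh\\:mm\\:ss\\.ff}"); // Duration: 00:03:07.95
```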

By the way, AVFormatContext holds a great deal of information:

    //
    // Summary:
    //     Format I/O context. New fields can be added to the end with minor version bumps.
    //     Removal, reordering and changes to existing fields require a major version bump.
    //     sizeof(AVFormatContext) must not be used outside libav*, use avformat_alloc_context()
    //     to create an AVFormatContext.
    public struct AVFormatContext
    {
        //
        // Summary:
        //     A class for logging and avoptions. Set by avformat_alloc_context(). Exports (de)muxer
        //     private options if they exist.
        public unsafe AVClass* av_class;

        //
        // Summary:
        //     The input container format.
        public unsafe AVInputFormat* iformat;

        //
        // Summary:
        //     The output container format.
        public unsafe AVOutputFormat* oformat;

        //
        // Summary:
        //     Format private data. This is an AVOptions-enabled struct if and only if iformat/oformat.priv_class
        //     is not NULL.
        public unsafe void* priv_data;

        //
        // Summary:
        //     I/O context.
        public unsafe AVIOContext* pb;

        //
        // Summary:
        //     Flags signalling stream properties. A combination of AVFMTCTX_*. Set by libavformat.
        public int ctx_flags;

        //
        // Summary:
        //     Number of elements in AVFormatContext.streams.
        public uint nb_streams;

        //
        // Summary:
        //     A list of all streams in the file. New streams are created with avformat_new_stream().
        public unsafe AVStream** streams;

        //
        // Summary:
        //     input or output filename
        [Obsolete("Use url instead.")]
        public byte_array1024 filename;

        //
        // Summary:
        //     input or output URL. Unlike the old filename field, this field has no length
        //     restriction.
        public unsafe byte* url;

        //
        // Summary:
        //     Position of the first frame of the component, in AV_TIME_BASE fractional seconds.
        //     NEVER set this value directly: It is deduced from the AVStream values.
        public long start_time;

        //
        // Summary:
        //     Duration of the stream, in AV_TIME_BASE fractional seconds. Only set this value
        //     if you know none of the individual stream durations and also do not set any of
        //     them. This is deduced from the AVStream values if not set.
        public long duration;

        //
        // Summary:
        //     Total stream bitrate in bit/s, 0 if not available. Never set it directly if the
        //     file_size and the duration are known as FFmpeg can compute it automatically.
        public long bit_rate;

        public uint packet_size;

        public int max_delay;

        //
        // Summary:
        //     Flags modifying the (de)muxer behaviour. A combination of AVFMT_FLAG_*. Set by
        //     the user before avformat_open_input() / avformat_write_header().
        public int flags;

        //
        // Summary:
        //     Maximum size of the data read from input for determining the input container
        //     format. Demuxing only, set by the caller before avformat_open_input().
        public long probesize;

        //
        // Summary:
        //     Maximum duration (in AV_TIME_BASE units) of the data read from input in avformat_find_stream_info().
        //     Demuxing only, set by the caller before avformat_find_stream_info(). Can be set
        //     to 0 to let avformat choose using a heuristic.
        public long max_analyze_duration;

        public unsafe byte* key;

        public int keylen;

        public uint nb_programs;

        public unsafe AVProgram** programs;

        //
        // Summary:
        //     Forced video codec_id. Demuxing: Set by user.
        public AVCodecID video_codec_id;

        //
        // Summary:
        //     Forced audio codec_id. Demuxing: Set by user.
        public AVCodecID audio_codec_id;

        //
        // Summary:
        //     Forced subtitle codec_id. Demuxing: Set by user.
        public AVCodecID subtitle_codec_id;

        //
        // Summary:
        //     Maximum amount of memory in bytes to use for the index of each stream. If the
        //     index exceeds this size, entries will be discarded as needed to maintain a smaller
        //     size. This can lead to slower or less accurate seeking (depends on demuxer).
        //     Demuxers for which a full in-memory index is mandatory will ignore this. - muxing:
        //     unused - demuxing: set by user
        public uint max_index_size;

        //
        // Summary:
        //     Maximum amount of memory in bytes to use for buffering frames obtained from realtime
        //     capture devices.
        public uint max_picture_buffer;

        //
        // Summary:
        //     Number of chapters in AVChapter array. When muxing, chapters are normally written
        //     in the file header, so nb_chapters should normally be initialized before write_header
        //     is called. Some muxers (e.g. mov and mkv) can also write chapters in the trailer.
        //     To write chapters in the trailer, nb_chapters must be zero when write_header
        //     is called and non-zero when write_trailer is called. - muxing: set by user -
        //     demuxing: set by libavformat
        public uint nb_chapters;

        public unsafe AVChapter** chapters;

        //
        // Summary:
        //     Metadata that applies to the whole file.
        public unsafe AVDictionary* metadata;

        //
        // Summary:
        //     Start time of the stream in real world time, in microseconds since the Unix epoch
        //     (00:00 1st January 1970). That is, pts=0 in the stream was captured at this real
        //     world time. - muxing: Set by the caller before avformat_write_header(). If set
        //     to either 0 or AV_NOPTS_VALUE, then the current wall-time will be used. - demuxing:
        //     Set by libavformat. AV_NOPTS_VALUE if unknown. Note that the value may become
        //     known after some number of frames have been received.
        public long start_time_realtime;

        //
        // Summary:
        //     The number of frames used for determining the framerate in avformat_find_stream_info().
        //     Demuxing only, set by the caller before avformat_find_stream_info().
        public int fps_probe_size;

        //
        // Summary:
        //     Error recognition; higher values will detect more errors but may misdetect some
        //     more or less valid parts as errors. Demuxing only, set by the caller before avformat_open_input().
        public int error_recognition;

        //
        // Summary:
        //     Custom interrupt callbacks for the I/O layer.
        public AVIOInterruptCB interrupt_callback;

        //
        // Summary:
        //     Flags to enable debugging.
        public int debug;

        //
        // Summary:
        //     Maximum buffering duration for interleaving.
        public long max_interleave_delta;

        //
        // Summary:
        //     Allow non-standard and experimental extension
        public int strict_std_compliance;

        //
        // Summary:
        //     Flags indicating events happening on the file, a combination of AVFMT_EVENT_FLAG_*.
        public int event_flags;

        //
        // Summary:
        //     Maximum number of packets to read while waiting for the first timestamp. Decoding
        //     only.
        public int max_ts_probe;

        //
        // Summary:
        //     Avoid negative timestamps during muxing. Any value of the AVFMT_AVOID_NEG_TS_*
        //     constants. Note, this only works when using av_interleaved_write_frame. (interleave_packet_per_dts
        //     is in use) - muxing: Set by user - demuxing: unused
        public int avoid_negative_ts;

        //
        // Summary:
        //     Transport stream id. This will be moved into demuxer private options. Thus no
        //     API/ABI compatibility
        public int ts_id;

        //
        // Summary:
        //     Audio preload in microseconds. Note, not all formats support this and unpredictable
        //     things may happen if it is used when not supported. - encoding: Set by user -
        //     decoding: unused
        public int audio_preload;

        //
        // Summary:
        //     Max chunk time in microseconds. Note, not all formats support this and unpredictable
        //     things may happen if it is used when not supported. - encoding: Set by user -
        //     decoding: unused
        public int max_chunk_duration;

        //
        // Summary:
        //     Max chunk size in bytes Note, not all formats support this and unpredictable
        //     things may happen if it is used when not supported. - encoding: Set by user -
        //     decoding: unused
        public int max_chunk_size;

        //
        // Summary:
        //     forces the use of wallclock timestamps as pts/dts of packets This has undefined
        //     results in the presence of B frames. - encoding: unused - decoding: Set by user
        public int use_wallclock_as_timestamps;

        //
        // Summary:
        //     avio flags, used to force AVIO_FLAG_DIRECT. - encoding: unused - decoding: Set
        //     by user
        public int avio_flags;

        //
        // Summary:
        //     The duration field can be estimated through various ways, and this field can
        //     be used to know how the duration was estimated. - encoding: unused - decoding:
        //     Read by user
        public AVDurationEstimationMethod duration_estimation_method;

        //
        // Summary:
        //     Skip initial bytes when opening stream - encoding: unused - decoding: Set by
        //     user
        public long skip_initial_bytes;

        //
        // Summary:
        //     Correct single timestamp overflows - encoding: unused - decoding: Set by user
        public uint correct_ts_overflow;

        //
        // Summary:
        //     Force seeking to any (also non key) frames. - encoding: unused - decoding: Set
        //     by user
        public int seek2any;

        //
        // Summary:
        //     Flush the I/O context after each packet. - encoding: Set by user - decoding:
        //     unused
        public int flush_packets;

        //
        // Summary:
        //     format probing score. The maximal score is AVPROBE_SCORE_MAX, its set when the
        //     demuxer probes the format. - encoding: unused - decoding: set by avformat, read
        //     by user
        public int probe_score;

        //
        // Summary:
        //     number of bytes to read maximally to identify format. - encoding: unused - decoding:
        //     set by user
        public int format_probesize;

        //
        // Summary:
        //     ',' separated list of allowed decoders. If NULL then all are allowed - encoding:
        //     unused - decoding: set by user
        public unsafe byte* codec_whitelist;

        //
        // Summary:
        //     ',' separated list of allowed demuxers. If NULL then all are allowed - encoding:
        //     unused - decoding: set by user
        public unsafe byte* format_whitelist;

        //
        // Summary:
        //     An opaque field for libavformat internal usage. Must not be accessed in any way
        //     by callers.
        public unsafe AVFormatInternal* @internal;

        //
        // Summary:
        //     IO repositioned flag. This is set by avformat when the underlaying IO context
        //     read pointer is repositioned, for example when doing byte based seeking. Demuxers
        //     can use the flag to detect such changes.
        public int io_repositioned;

        //
        // Summary:
        //     Forced video codec. This allows forcing a specific decoder, even when there are
        //     multiple with the same codec_id. Demuxing: Set by user
        public unsafe AVCodec* video_codec;

        //
        // Summary:
        //     Forced audio codec. This allows forcing a specific decoder, even when there are
        //     multiple with the same codec_id. Demuxing: Set by user
        public unsafe AVCodec* audio_codec;

        //
        // Summary:
        //     Forced subtitle codec. This allows forcing a specific decoder, even when there
        //     are multiple with the same codec_id. Demuxing: Set by user
        public unsafe AVCodec* subtitle_codec;

        //
        // Summary:
        //     Forced data codec. This allows forcing a specific decoder, even when there are
        //     multiple with the same codec_id. Demuxing: Set by user
        public unsafe AVCodec* data_codec;

        //
        // Summary:
        //     Number of bytes to be written as padding in a metadata header. Demuxing: Unused.
        //     Muxing: Set by user via av_format_set_metadata_header_padding.
        public int metadata_header_padding;

        //
        // Summary:
        //     User data. This is a place for some private data of the user.
        public unsafe void* opaque;

        //
        // Summary:
        //     Callback used by devices to communicate with application.
        public AVFormatContext_control_message_cb_func control_message_cb;

        //
        // Summary:
        //     Output timestamp offset, in microseconds. Muxing: set by user
        public long output_ts_offset;

        //
        // Summary:
        //     dump format separator. can be ", " or " " or anything else - muxing: Set by user.
        //     - demuxing: Set by user.
        public unsafe byte* dump_separator;

        //
        // Summary:
        //     Forced Data codec_id. Demuxing: Set by user.
        public AVCodecID data_codec_id;

        //
        // Summary:
        //     Called to open further IO contexts when needed for demuxing.
        [Obsolete("Use io_open and io_close.")]
        public AVFormatContext_open_cb_func open_cb;

        //
        // Summary:
        //     ',' separated list of allowed protocols. - encoding: unused - decoding: set by
        //     user
        public unsafe byte* protocol_whitelist;

        //
        // Summary:
        //     A callback for opening new IO streams.
        public AVFormatContext_io_open_func io_open;

        //
        // Summary:
        //     A callback for closing the streams opened with AVFormatContext.io_open().
        public AVFormatContext_io_close_func io_close;

        //
        // Summary:
        //     ',' separated list of disallowed protocols. - encoding: unused - decoding: set
        //     by user
        public unsafe byte* protocol_blacklist;

        //
        // Summary:
        //     The maximum number of streams. - encoding: unused - decoding: set by user
        public int max_streams;

        //
        // Summary:
        //     Skip duration calcuation in estimate_timings_from_pts. - encoding: unused - decoding:
        //     set by user
        public int skip_estimate_duration_from_pts;

        //
        // Summary:
        //     Maximum number of packets that can be probed - encoding: unused - decoding: set
        //     by user
        public int max_probe_packets;
    }
}

I'm still a beginner, so much of this is beyond me; I've only managed to query it in roughly the following way.

AVFormatContext* av_context = ffmpeg.avformat_alloc_context(); // or simply: = null;

string filePath = @"D:\media_sample\output2.mp4";

int ret = ffmpeg.avformat_open_input(&av_context, filePath, null, null);
if (ret != 0)
{
    return;
}

Console.WriteLine("[metadata]");

WriteMetadata(av_context->metadata);

Console.WriteLine();
Console.WriteLine("[context-info]");
Console.WriteLine($"duration: {(decimal)av_context->duration / ffmpeg.AV_TIME_BASE}");
Console.WriteLine($"# of streams: {av_context->nb_streams}");
Console.WriteLine($"url: {Marshal.PtrToStringAnsi(new IntPtr(av_context->url))}");

ffmpeg.avformat_find_stream_info(av_context, null);
Console.WriteLine($"start time: {av_context->start_time}");
Console.WriteLine($"bitrate: {(decimal)av_context->bit_rate / 1000} kb/s");

for (int i = 0; i < av_context->nb_streams; i++)
{
    Console.WriteLine();
    AVStream* stream = av_context->streams[i];
    AVCodecParameters* codecpar = stream->codecpar;
    AVCodecContext* codec = stream->codec;
    Console.WriteLine($"Stream #0:{stream->index} {codecpar->codec_type} {codecpar->codec_id} (0x{codecpar->codec_tag.ToString("x")})");

    if (codecpar->codec_type == AVMediaType.AVMEDIA_TYPE_VIDEO)
    {
        Console.Write($"{codecpar->width}x{codecpar->height}, {stream->codec->pix_fmt}, {(AVPixelFormat)codecpar->format}, {codecpar->color_primaries}, {((float)codec->framerate.num / codec->framerate.den):.00} fps");
        Console.Write($", {stream->codec->pkt_timebase.den / stream->codec->pkt_timebase.num / 1000}k tbn");

        float tbc = (float)stream->codec->time_base.den / stream->codec->time_base.num;
        if (tbc > 1000)
        {
            Console.Write($", {tbc / 1000}k tbc");
        }
        else
        {
            Console.Write($", {tbc:.00} tbc");
        }
        Console.WriteLine();
    }
    else
    {
        Console.WriteLine($"{codecpar->sample_rate} Hz, {codecpar->channels} Channel(s), {codecpar->bit_rate / 1000} kb/s, {(AVSampleFormat)codecpar->format}");
    }

    WriteMetadata(stream->metadata);
}
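The WriteMetadata helper used above comes from the earlier metadata post; for completeness, a minimal sketch that enumerates an AVDictionary with av_dict_get (the `key = value` output format matches the listing below):

```csharp
// A minimal sketch of the WriteMetadata helper: av_dict_get with an empty
// key plus AV_DICT_IGNORE_SUFFIX matches every entry, so passing the
// previous entry back in walks the whole dictionary.
static unsafe void WriteMetadata(AVDictionary* dict)
{
    AVDictionaryEntry* entry = null;

    while ((entry = ffmpeg.av_dict_get(dict, "", entry, ffmpeg.AV_DICT_IGNORE_SUFFIX)) != null)
    {
        string key = Marshal.PtrToStringAnsi(new IntPtr(entry->key));
        string value = Marshal.PtrToStringAnsi(new IntPtr(entry->value));
        Console.WriteLine($"{key} = {value}");
    }
}
```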

Running the code above on the 4-stream file described in the previous post produces the following output.

[metadata]
major_brand = isom
minor_version = 512
compatible_brands = isomiso2avc1mp41
encoder = Lavf58.45.100

[context-info]
duration: 187.954
# of streams: 4
url: D:\media_sample\output2.mp4
start time: 0
bitrate: 3553.945 kb/s

Stream #0:0 AVMEDIA_TYPE_VIDEO AV_CODEC_ID_H264 (0x31637661)
1920x1080, AV_PIX_FMT_YUV420P, AV_PIX_FMT_YUV420P, AVCOL_PRI_BT709, 29.97 fps, 16k tbn, 59.94 tbc
language = und
handler_name = ISO Media file produced by Google Inc.
vendor_id = [0][0][0][0]

Stream #0:1 AVMEDIA_TYPE_AUDIO AV_CODEC_ID_AAC (0x6134706d)
48000 Hz, 2 Channel(s), 128 kb/s, AV_SAMPLE_FMT_FLTP
language = eng
handler_name = SoundHandler
vendor_id = [0][0][0][0]

Stream #0:2 AVMEDIA_TYPE_AUDIO AV_CODEC_ID_AAC (0x6134706d)
44100 Hz, 2 Channel(s), 128 kb/s, AV_SAMPLE_FMT_FLTP
language = eng
handler_name = ISO Media file produced by Google Inc.
vendor_id = [0][0][0][0]

Stream #0:3 AVMEDIA_TYPE_VIDEO AV_CODEC_ID_VP9 (0x39307076)
640x360, AV_PIX_FMT_YUV420P, AV_PIX_FMT_YUV420P, AVCOL_PRI_BT709, 29.97 fps, 16k tbn, 16k tbc
language = eng
handler_name = VideoHandler
vendor_id = [0][0][0][0]

Shall we compare this information with what ffprobe reports? ^^

D:\media_sample> ffprobe output2.mp4
...[omitted]...
  libpostproc    55.  9.100 / 55.  9.100
Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'output2.mp4':
  Metadata:
    major_brand     : isom
    minor_version   : 512
    compatible_brands: isomiso2avc1mp41
    encoder         : Lavf58.45.100
  Duration: 00:03:07.95, start: 0.000000, bitrate: 3553 kb/s
  Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(tv, bt709), 1920x1080 [SAR 1:1 DAR 16:9], 3070 kb/s, 29.97 fps, 29.97 tbr, 16k tbn, 59.94 tbc (default)
    Metadata:
      handler_name    : ISO Media file produced by Google Inc.
      vendor_id       : [0][0][0][0]
  Stream #0:1(eng): Audio: aac (LC) (mp4a / 0x6134706D), 48000 Hz, stereo, fltp, 128 kb/s (default)
    Metadata:
      handler_name    : SoundHandler
      vendor_id       : [0][0][0][0]
  Stream #0:2(eng): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
    Metadata:
      handler_name    : ISO Media file produced by Google Inc.
      vendor_id       : [0][0][0][0]
  Stream #0:3(eng): Video: vp9 (Profile 0) (vp09 / 0x39307076), yuv420p(tv, bt709), 640x360, 212 kb/s, SAR 1:1 DAR 16:9, 29.97 fps, 29.97 tbr, 16k tbn, 16k tbc (default)
    Metadata:
      handler_name    : VideoHandler
      vendor_id       : [0][0][0][0]

The results are roughly similar. The remaining details can be worked out gradually over time. ^^




(Updated 2022-02-01)

It turns out there is an av_dump_format function that produces exactly the same output as ffprobe. ^^

int videoIndex = -1;
int audioIndex = -1;

{
    videoIndex = ffmpeg.av_find_best_stream(av_context, AVMediaType.AVMEDIA_TYPE_VIDEO, -1, -1, null, 0);
    audioIndex = ffmpeg.av_find_best_stream(av_context, AVMediaType.AVMEDIA_TYPE_AUDIO, -1, -1 /* or videoIndex */, null, 0);

    ffmpeg.av_dump_format(av_context, videoIndex, filePath, 0);
}
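av_find_best_stream returns a negative AVERROR code (e.g. AVERROR_STREAM_NOT_FOUND) when no matching stream exists, so it may be safer to check the index before passing it on; a sketch:

```csharp
// av_find_best_stream returns the stream index on success and a negative
// AVERROR code otherwise, so check before use. The last argument of
// av_dump_format (is_output) is 0 because av_context is an input context.
int videoIndex = ffmpeg.av_find_best_stream(av_context, AVMediaType.AVMEDIA_TYPE_VIDEO, -1, -1, null, 0);
if (videoIndex < 0)
{
    Console.WriteLine("No video stream found.");
}
else
{
    ffmpeg.av_dump_format(av_context, videoIndex, filePath, 0);
}
```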

(The attached file contains the example code from this post.)




[I would like to share opinions on this post with you. If anything is wrong or lacking, or if you have any questions, please leave a comment.]


[First posted: ]
[Last updated: 2/1/2022]

Creative Commons License
This work is licensed under the Creative Commons Korea Attribution-NonCommercial-NoDerivs 2.0 Korea License.
by SeongTae Jeong, mailto:techsharer at outlook.com
