Media (Intel® Video Processing Library, Intel Media SDK)
Access community support for transcoding, decoding, and encoding in applications that use media tools such as Intel® oneAPI Video Processing Library and Intel® Media SDK
Announcements
The Intel Media SDK project is no longer active. For continued support and access to new features, Intel Media SDK users are encouraged to read the transition guide on upgrading from Intel® Media SDK to Intel® Video Processing Library (VPL), and to move to VPL as soon as possible.
For more information, see the VPL website.

Joining videos together with rotation

Liron_K_
Beginner

I have an app which records video from both the front and back camera, and I want to be able to stitch the videos together into one long video.

When I try this (with the following code), any videos from the rear camera come out flipped upside down.

Basically, what I think I need is a way to join the videos while setting a different rotation effect on each video segment.

Additionally, the performance is pretty slow - is there a way to speed this up?

Thanks,

Liron

Code:

public Task<Void> mergeVideos(List<RecordedFileInfo> files, final File outputFile, Context c)
{
  final Task<Void>.TaskCompletionSource task = Task.create();
  IProgressListener progressListener = new IProgressListener()
  {
    @Override
    public void onMediaStart()
    {
      Logger.d(TAG, "onMediaStart", null);
    }

    @Override
    public void onMediaProgress(float progress)
    {
      final float mediaProgress = progress;
    }

    @Override
    public void onMediaDone()
    {
      Logger.d(TAG, "onMediaDone", null);
      if(outputFile.exists())
      {
        Logger.d(TAG, "onMediaDone", "Success");
        task.setResult(null);
      }
      else
      {
        Logger.d(TAG, "onMediaDone", "Failed");
        task.setError(new Exception("Something went wrong during video merge"));
      }
    }

    @Override
    public void onMediaPause()
    {
      Logger.d(TAG, "onMediaPause", null);
    }

    @Override
    public void onMediaStop()
    {
      Logger.d(TAG, "onMediaStop", null);
    }

    @Override
    public void onError(Exception exception)
    {
      Logger.LogException(TAG, "onError", exception);
      final Exception e = exception;
      task.setError(e);
    }
  };

  AndroidMediaObjectFactory factory = new AndroidMediaObjectFactory(c);
  MediaComposer composer = new MediaComposer(factory, progressListener);

  MediaFileInfo mediaFileInfo = null;
  long i = 0;
  try
  {
    for(RecordedFileInfo f : files)
    {
      //Get rotation
      File file = f.getFile();
      MediaMetadataRetriever retriever = new MediaMetadataRetriever();
      retriever.setDataSource(file.getAbsolutePath());
      String extractMetadata = retriever.extractMetadata(MediaMetadataRetriever.METADATA_KEY_VIDEO_ROTATION);
      int angle = Integer.parseInt(extractMetadata);

      android.net.Uri uri = android.net.Uri.fromFile(file);
      com.intel.inde.mp.Uri fileUri = new com.intel.inde.mp.Uri(uri.toString());
      composer.addSourceFile(fileUri);
      if(mediaFileInfo == null)
      {
        composer.setTargetFile(outputFile.getAbsolutePath());
        mediaFileInfo = new MediaFileInfo(new AndroidMediaObjectFactory(c));
        mediaFileInfo.setUri(fileUri);
        AudioFormat af = (AudioFormat)mediaFileInfo.getAudioFormat();
        VideoFormat vf = (VideoFormat)mediaFileInfo.getVideoFormat();

        configureVideoEncoder(composer, vf, angle);
        configureAudioEncoder(composer, af);
      }

      //Fix rotation;
      RotateEffect effect = new RotateEffect(angle, EglUtil.getInstance());
      effect.setSegment(new Pair<Long, Long>(0L, 0L));  // Apply to all stream
      composer.addVideoEffect(effect);

      i++;
    }
    composer.start();
  }
  catch(Exception ex)
  {
    Logger.LogException(TAG, "Failed to setup video merger", ex);
    return Task.forError(ex);
  }
  return task.getTask();
}
protected void configureVideoEncoder(MediaComposer mediaComposer, VideoFormat formatToCopy, int angle)
{
  VideoFormatAndroid targetFormat;
  Resolution frameSize = formatToCopy.getVideoFrameSize();
  String videoMimeType = formatToCopy.getMimeType();
  if(angle == 90 || angle == 270 || angle == -90)
  {
    frameSize = new Resolution(frameSize.height(), frameSize.width());
  }
  targetFormat = new VideoFormatAndroid(videoMimeType, frameSize.width(), frameSize.height());

  //TODO: Get this value from somewhere
  try
  {
    targetFormat.setVideoBitRateInKBytes(formatToCopy.getVideoBitRateInKBytes());
  }
  catch(RuntimeException ex)
  {
    targetFormat.setVideoBitRateInKBytes(5000);//formatToCopy.getVideoBitRateInKBytes());
  }
  try
  {
    targetFormat.setVideoFrameRate(formatToCopy.getVideoFrameRate());
  }
  catch(RuntimeException ex)
  {
    targetFormat.setVideoFrameRate(30);
  }
  try
  {
    targetFormat.setVideoIFrameInterval(formatToCopy.getVideoIFrameInterval());
  }
  catch(RuntimeException ex)
  {
    targetFormat.setVideoIFrameInterval(1);
  }
  targetFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 0);
  mediaComposer.setTargetVideoFormat(targetFormat);
}

protected void configureAudioEncoder(MediaComposer mediaComposer, AudioFormat formatToCopy)
{

  AudioFormatAndroid targetFormat;
  int sampleRate, channelCount;
  try
  {
    sampleRate = formatToCopy.getAudioSampleRateInHz();
  }
  catch(RuntimeException ex)
  {
    sampleRate = 48000;
  }
  try
  {
    channelCount = formatToCopy.getAudioChannelCount();
  }
  catch(RuntimeException ex)
  {
    channelCount = 2;
  }
  targetFormat = new AudioFormatAndroid(formatToCopy.getMimeType(), sampleRate, channelCount);

  try
  {
    targetFormat.setAudioBitrateInBytes(formatToCopy.getAudioBitrateInBytes());
  }
  catch(RuntimeException ex)
  {
    targetFormat.setAudioBitrateInBytes(96 * 1024);
  }

  try
  {
    targetFormat.setAudioProfile(formatToCopy.getAudioProfile());
  }
  catch(RuntimeException ex)
  {
    targetFormat.setAudioProfile(MediaCodecInfo.CodecProfileLevel.AACObjectLC);//formatToCopy.getAudioProfile());
  }
  targetFormat.setInteger(MediaFormat.KEY_MAX_INPUT_SIZE, 0);
  mediaComposer.setTargetAudioFormat(targetFormat);
}

public class RotateEffect extends VideoEffect
{
  public RotateEffect(int angle, IEglUtil eglUtil)
  {
    super(angle, eglUtil);
  }
}

5 Replies
Harshdeep_B_Intel

Hi,

We have Media for Mobile, which includes support for joining two captured videos. You can download Media for Mobile as part of INDE here: https://software.intel.com/en-us/intel-inde and samples from https://software.intel.com/en-us/media-for-mobile-support/code-samples

You can align the videos before using the joining feature from Media for Mobile. Currently we do not support a rotate effect. Regarding performance, fps differs from device to device in capture scenarios. Most devices give normal performance, but there are several exceptions depending on the device.

Thanks,

Liron_K_
Beginner

Please see the code in my original post. I am using Media for Mobile, but not getting the effect that I need.

Since the input videos have different rotations (the front camera is rotated 90° and the back camera is rotated 270°), when I join them together without using the RotateEffect, they are both rotated sideways (in opposite directions). Once I add the RotateEffect, the front camera is rotated correctly but the back camera is upside down.

Also, the code above takes about 10 seconds on a Samsung S4 to join two 10-second videos, which doesn't seem very good to me.

Sincerely,
Liron

Harshdeep_B_Intel

Hi,

Can you please provide us with snapshots of the video when joined with and without RotateEffect? Let me check with our team and get back to you regarding performance on the Samsung S4 device.

Thanks,

Liron_K_
Beginner

Videos attached.

You can see that in either case the output is wrong, unfortunately. One input video has a rotation of 90° and the other has a rotation of 270°, which is what the Android Camera generates when recording.

Nikolay_A_Intel
Employee

Hi Liron,

In order to correct the rotation, you have to change

//Fix rotation;
RotateEffect effect = new RotateEffect(angle, EglUtil.getInstance());
effect.setSegment(new Pair<Long, Long>(0L, 0L));  // Apply to all stream
composer.addVideoEffect(effect);

to something like this:

//Fix rotation;
RotateEffect effect1 = new RotateEffect(<angle1>, EglUtil.getInstance());
effect1.setSegment(new FileSegment(0L, <duration1>));  // Apply to 1st stream
composer.addVideoEffect(effect1);

RotateEffect effect2 = new RotateEffect(<angle2>, EglUtil.getInstance());
effect2.setSegment(new FileSegment(<duration1>, <duration2>));  // Apply to 2nd stream
composer.addVideoEffect(effect2);
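
Generalizing that suggestion to a loop over N clips, the key bookkeeping is accumulating each clip's duration so that clip N's effect covers exactly its own portion of the merged timeline. Here is a minimal, pure-Java sketch of that part; the `SegmentPlanner` and `SegmentRange` names are illustrative helpers, not part of the Media for Mobile API:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical helper: given each clip's duration (in microseconds), compute
// the [start, end) range of the merged timeline that each clip occupies, so a
// per-clip RotateEffect can be limited to that range.
public class SegmentPlanner {
    public static final class SegmentRange {
        public final long startUs;
        public final long endUs;
        public SegmentRange(long startUs, long endUs) {
            this.startUs = startUs;
            this.endUs = endUs;
        }
    }

    // Accumulate durations so clip N's segment starts where clip N-1 ended.
    public static List<SegmentRange> segmentsFor(long[] durationsUs) {
        List<SegmentRange> out = new ArrayList<>();
        long offset = 0;
        for (long d : durationsUs) {
            out.add(new SegmentRange(offset, offset + d));
            offset += d;
        }
        return out;
    }
}
```

In the merge loop, one would read each clip's duration (for example via MediaMetadataRetriever.METADATA_KEY_DURATION), then create one RotateEffect per clip with its own angle and call `effect.setSegment(new FileSegment(range.startUs, range.endUs))`, assuming FileSegment takes start and end positions on the merged timeline as the snippet above suggests.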


What performance do you really expect? 2x real-time seems quite good to me.
