Trying to send RTP using JMF


I am trying to transmit a U-Law wav file using JMF.

I downloaded a sample program and tested it, but something seems to be wrong: the data appears to go out on a different port than the one I configured (port 137 instead of the configured 4446), and what is transmitted does not look like RTP data either.

I can't tell which part is the problem.

The source I tested is below.

Which part could be wrong?

The file size seems to be read correctly, though...

 

import java.awt.*;
import javax.media.*;
import javax.media.protocol.*;
import javax.media.protocol.DataSource;
import javax.media.format.*;
import javax.media.control.TrackControl;
import javax.media.control.QualityControl;
import java.io.*;

public class AudioTransmit {
    // Input MediaLocator
    // Can be a file or http or capture source
    private MediaLocator locator;
    private String ipAddress;
    private String port;
    private Processor processor = null;
    private DataSink rtptransmitter = null;
    private DataSource dataOutput = null;
    public AudioTransmit(MediaLocator locator,
        String ipAddress,
        String port)
    {
        this.locator = locator;
        this.ipAddress = ipAddress;
        this.port = port;
    }

    /**
     * Starts the transmission. Returns null if transmission started ok.
     * Otherwise it returns a string with the reason why the setup failed.
     */
    public synchronized String start()
    {
        String result;
        // Create a processor for the specified media locator
        // and program it to output RTP
        result = createProcessor();
        if (result != null)
            return result;
        // Create an RTP session to transmit the output of the
        // processor to the specified IP address and port no.
        result = createTransmitter();
        if (result != null)
        {
            processor.close();
            processor = null;
            return result;
        }
        // Start the transmission
        processor.start();
        return null;
    }

    /**
     * Stops the transmission if already started
     */
    public void stop()
    {
        synchronized(this)
        {
            if (processor != null)
            {
                processor.stop();
                processor.close();
                processor = null;
                rtptransmitter.close();
                rtptransmitter = null;
            }
        }
    }

    private String createProcessor()
    {
        if (locator == null)
            return "Locator is null";

        DataSource ds;
        DataSource clone;

        try
        {
            ds = Manager.createDataSource(locator);
        } catch (Exception e)
        {
            return "Couldn't create DataSource";
        }

        // Try to create a processor to handle the input media locator
        try
        {
            processor = Manager.createProcessor(ds);
        } catch (NoProcessorException npe)
        {
            return "Couldn't create processor";
        } catch (IOException ioe)
        {
            return "IOException creating processor";
        }

        // Wait for it to configure
        boolean result = waitForState(processor, Processor.Configured);
        if (result == false)
            return "Couldn't configure processor";

        // Get the tracks from the processor
        TrackControl[] tracks = processor.getTrackControls();

        // Do we have at least one track?
        if (tracks == null || tracks.length < 1)
            return "Couldn't find tracks in processor";

        boolean programmed = false;
        AudioFormat afmt;

        // Search through the tracks for an audio track
        for (int i = 0; i < tracks.length; i++)
        {
            Format format = tracks[i].getFormat();
            if (tracks[i].isEnabled() && format instanceof AudioFormat && !programmed)
            {
                afmt = (AudioFormat) tracks[i].getFormat();
                AudioFormat ulawFormat = new AudioFormat(AudioFormat.ULAW_RTP,
                    afmt.getSampleRate(),
                    afmt.getSampleSizeInBits(),
                    afmt.getChannels());
                // 8000,4,1);

                tracks[i].setFormat(ulawFormat);
                System.err.println("Audio transmitted as:");
                System.err.println("  " + ulawFormat);
                // Assume successful
                programmed = true;
            } else
                tracks[i].setEnabled(false);
        }

        if (!programmed)
            return "Couldn't find Audio track";

        // Set the output content descriptor to RAW_RTP
        ContentDescriptor cd = new ContentDescriptor(ContentDescriptor.RAW_RTP);
        processor.setContentDescriptor(cd);

        // Realize the processor. This will internally create a flow
        // graph and attempt to create an output datasource for ULAW/RTP
        // Audio frames.
        result = waitForState(processor, Controller.Realized);
        if (result == false)
            return "Couldn't realize processor";


        // Get the output data source of the processor
        dataOutput = processor.getDataOutput();
        return null;
    }

    // Creates an RTP transmit data sink. This is the easiest way to create
    // an RTP transmitter. The other way is to use the RTPSessionManager API.
    // Using an RTP session manager gives you more control if you wish to
    // fine tune your transmission and set other parameters.
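    // (For comparison, a minimal RTPManager-based sketch is included after this class.)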
    private String createTransmitter()
    {
        // Create a media locator for the RTP data sink.
        // For example:
        //    rtp://129.130.131.132:42050/Audio
        String rtpURL = "rtp://192.168.10.173:4446/audio"; //" + ipAddress + ":" + port + "/audio";
        MediaLocator outputLocator = new MediaLocator(rtpURL);

        // Create a data sink, open it and start transmission. It will wait
        // for the processor to start sending data. So we need to start the
        // output data source of the processor. We also need to start the
        // processor itself, which is done after this method returns.
        try
        {
            rtptransmitter = Manager.createDataSink(dataOutput, outputLocator);
            rtptransmitter.open();
            rtptransmitter.start();
            dataOutput.start();
        } catch (MediaException me)
        {
            return "Couldn't create RTP data sink";
        } catch (IOException ioe)
        {
            return "Couldn't create RTP data sink";
        }

        return null;
    }

    /****************************************************************
     * Convenience methods to handle processor's state changes.
     ****************************************************************/

    private Integer stateLock = new Integer(0);
    private boolean failed = false;

    Integer getStateLock()
    {
        return stateLock;
    }

    void setFailed()
    {
        failed = true;
    }

    private synchronized boolean waitForState(Processor p, int state)
    {
        p.addControllerListener(new StateListener());
        failed = false;

        // Call the required method on the processor
        if (state == Processor.Configured)
        {
            p.configure();
        } else if (state == Processor.Realized)
        {
            p.realize();
        }

        // Wait until we get an event that confirms the
        // success of the method, or a failure event.
        // See StateListener inner class
        while (p.getState() < state && !failed)
        {
            synchronized(getStateLock())
            {
                try
                {
                    getStateLock().wait();
                } catch (InterruptedException ie)
                {
                    return false;
                }
            }
        }

        if (failed)
            return false;
        else
            return true;
    }

    /****************************************************************
     * Inner Classes
     ****************************************************************/

    class StateListener implements ControllerListener
    {
        public void controllerUpdate(ControllerEvent ce)
        {
            // If there was an error during configure or
            // realize, the processor will be closed
            if (ce instanceof ControllerClosedEvent)
                setFailed();

            // All controller events, send a notification
            // to the waiting thread in waitForState method.
            if (ce instanceof ControllerEvent)
            {
                synchronized(getStateLock())
                {
                    getStateLock().notifyAll();
                }
            }
        }
    }


    /****************************************************************
     * Sample Usage for AudioTransmit class
     ****************************************************************/

    public static void main(String[] args)
    {
        // We need three parameters to do the transmission
        // For example,
        //   java AudioTransmit file:/C:/media/test.mov  129.130.131.132 42050

        /* if (args.length < 3) {
            System.err.println("Usage: AudioTransmit <sourceURL> <destIP> <destPort>");
            System.exit(-1);
        }*/

        // Create a Audio transmit object with the specified params.
        /*     AudioTransmit at = new AudioTransmit(new MediaLocator(args[0]),
             args[1],
             args[2]);*/


        AudioTransmit at = new AudioTransmit(new MediaLocator("file:/D:/1.wav"), "192.168.10.173", "4446");
        // Start the transmission
        String result = at.start();

        // result will be non-null if there was an error. The return
        // value is a String describing the possible error. Print it.
        if (result != null) {
            System.err.println("Error : " + result);
            System.exit(0);
        }

        System.err.println("Start transmission for 5 seconds...");

        // Transmit for 5 seconds and then close the processor.
        // This is a safeguard when using a capture data source
        // so that the capture device will be properly released
        // before quitting.
        // The right thing to do would be to have a GUI with a
        // "Stop" button that would call stop on AudioTransmit
        try {
            Thread.sleep(5000);
        } catch (InterruptedException ie) {
        }

        // Stop the transmission
        at.stop();

        System.err.println("...transmission ended.");

        System.exit(0);
    }
}
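
For comparison, the comment above createTransmitter() mentions the RTPManager route as the more controllable alternative to a DataSink. Below is a minimal, untested sketch of that route; the dataOutput, ipAddress and port names are assumed to correspond to the fields of the class above (with the port passed as an int), and JMF 2.1.1's javax.media.rtp API is assumed.

import java.net.InetAddress;
import javax.media.protocol.DataSource;
import javax.media.rtp.RTPManager;
import javax.media.rtp.SendStream;
import javax.media.rtp.SessionAddress;

// Sketch only: send the processor's RAW_RTP output through an RTPManager,
// setting the destination address and port explicitly instead of parsing
// them out of an rtp:// URL.
public class RtpManagerTransmit
{
    public static SendStream transmit(DataSource dataOutput, String ipAddress, int port)
        throws Exception
    {
        RTPManager rtpm = RTPManager.newInstance();

        // Bind the local end of the session (RTP ports are conventionally even).
        SessionAddress localAddr = new SessionAddress(InetAddress.getLocalHost(), port);
        rtpm.initialize(localAddr);

        // Register the destination and start sending track 0 of the data source.
        SessionAddress destAddr = new SessionAddress(InetAddress.getByName(ipAddress), port);
        rtpm.addTarget(destAddr);

        SendStream stream = rtpm.createSendStream(dataOutput, 0);
        stream.start();
        return stream;
    }
}

With this route the processor is still configured, realized and started exactly as in start() above; only the createTransmitter() step changes.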

 

Asked by 부천임군 in Java, 1 month ago

1 answer


https://karsvv.tistory.com/21

Please check the following, which is explained at the link above.

Whether the data source is a file or live video, the source video must be converted to the JPEG/RTP format. Cinepak, RGB, YUV and JPEG are the commonly used source data types; other formats can cause problems because of internal limitations of the processor.
In addition, the width and height of the input video must each be an integer multiple of 8 (8 x 8 blocks). It is a rather silly restriction: the JPEG baseline codec splits the input image into 8x8 blocks and compresses each block separately, which is where the constraint comes from, but in any case the current JMF simply does not support frame sizes that don't match it. Sizes such as 320x240 and 176x144 are recognized, while sizes such as 240x180 or 90x60 fail to be recognized.
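
For reference, here is a minimal, untested sketch of the kind of track programming described above: requesting JPEG/RTP output on a video track with the frame size rounded down to multiples of 8. The helper below is hypothetical, assumes the track's current format is a VideoFormat, and is not a drop-in change for the audio code in the question.

import java.awt.Dimension;
import javax.media.Format;
import javax.media.control.TrackControl;
import javax.media.format.VideoFormat;

// Sketch only: program one video track for JPEG/RTP with a size that is a
// multiple of 8 in both dimensions, as required by the JPEG packetizer.
final class JpegRtpTrack
{
    static void program(TrackControl track)
    {
        VideoFormat vf = (VideoFormat) track.getFormat();
        Dimension size = vf.getSize();
        float frameRate = vf.getFrameRate();

        // Round width and height down to the nearest multiple of 8.
        int w = (size.width / 8) * 8;
        int h = (size.height / 8) * 8;

        VideoFormat jpegRtp = new VideoFormat(VideoFormat.JPEG_RTP,
            new Dimension(w, h),
            Format.NOT_SPECIFIED,   // max data length
            Format.byteArray,
            frameRate);
        track.setFormat(jpegRtp);
    }
}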

Answered by kimho, 1 month ago