
Multi External Camera #121

Open
milazki opened this issue Jun 26, 2024 · 14 comments

milazki commented Jun 26, 2024

Hi, Pedro!
I've recently been studying how Android works with external cameras, and I found your feature/extra-video-source branch. I managed to use it to output video from my USB camera.

I also went through the existing issues and found #110, where ronaldsampaio implemented his own openCamera function for CameraClient in the AndroidUSBCamera library (#110 (comment)), which helped solve the video output problem.

My question: is there any way to output all the cameras connected to an Android device simultaneously (for example, two cameras connected via a hub) and stream them all at once? Do I need to implement my own openCamera with a SurfaceTexture for MultiCamera?
Thanks


dixtdf commented Jun 27, 2024

Here's an example, but it's for a single USB camera: https://github.com/pedroSG94/USBStreaming. I'm also working on multi-camera playback today; the lack of documentation is a big headache!

pedroSG94 (Owner) commented:

Hello,

Do you mean this class?:
https://github.com/ronaldsampaio/AndroidUSBCamera/blob/master/libausbc/src/main/java/com/jiangdg/ausbc/MultiCameraClient.kt

I'll need to check it, but I don't have multiple cameras, so I can't fully test it; I can only test opening one camera with that class.
Anyway, don't expect too much. It's a third-party library, so I can't give you support for it.


dixtdf commented Jun 27, 2024

Currently I'm looking at other libraries. https://github.com/shiyinghan/UVCAndroid can play multiple cameras properly, and I can use a TextureView or SurfaceView as a container. But when I use the TextureView or SurfaceView as the RTSP server's screen I have a problem: the rtsp:// output is always black!

@pedroSG94 Can you look at the problem for me?


    TextureView textureView = new TextureView(getContext());

    textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
        @Override
        public void onSurfaceTextureAvailable(@NonNull SurfaceTexture surface, int width, int height) {
            // Render the USB camera into the TextureView's own surface
            streamPlayerSurfaceViewUsbCameraHelper.startPreview();
            streamPlayerSurfaceViewUsbCameraHelper.addSurface(surface, false);

            RtspServerStream rtspServerStream = new RtspServerStream(getContext(), 17554, new ConnectCheckerEvent() {
                @Override
                public void onStreamEvent(StreamEvent streamEvent, String s) {
                    Log.d(TAG, s);
                }
            });
            rtspServerStream.getGlInterface().setAspectRatioMode(AspectRatioMode.Fill);

            Size previewSize = streamPlayerSurfaceViewUsbCameraHelper.getPreviewSize();
            boolean prepareVideo = rtspServerStream.prepareVideo(previewSize.width, previewSize.height, 2000 * 1024, 25, 0, 90);
            boolean prepareAudio = rtspServerStream.prepareAudio(48000, false, 128000);
            rtspServerStream.startStream();
            rtspServerStream.startPreview(textureView);
        }

        @Override
        public void onSurfaceTextureSizeChanged(@NonNull SurfaceTexture surface, int width, int height) { }

        @Override
        public boolean onSurfaceTextureDestroyed(@NonNull SurfaceTexture surface) { return true; }

        @Override
        public void onSurfaceTextureUpdated(@NonNull SurfaceTexture surface) { }
    });

pedroSG94 (Owner) commented Jun 27, 2024

I can't help you with just that; I need full example code to test it.
For now, I can tell you that you're misunderstanding how StreamBase and VideoSource work.
Let me explain a bit about the way StreamBase renders surfaces.

There is an internal surface that you need to render into using an external API (CameraX, a USB library, etc.). This surface receives frames from those APIs and is what the library works with internally. You can access that surface through a VideoSource's start method, like here:
https://github.com/pedroSG94/RootEncoder/blob/master/app/src/main/java/com/pedro/streamer/rotation/CameraXSource.kt#L72

The startPreview method only shows the result of the library's render: basically, the stream result is copied into that surface. That surface is not used to read frames; it is used to write the frames provided by the VideoSource's surface.

So you need to create a new VideoSource like CameraXSource, using your streamPlayerSurfaceViewUsbCameraHelper as the source in place of CameraX: pass the surface provided in the VideoSource's start method to streamPlayerSurfaceViewUsbCameraHelper.addSurface, and then show the result in a preview with startPreview using the textureView.
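A minimal sketch of that wiring, assuming the VideoSource API as it appears later in this thread and UVCAndroid's ICameraHelper as the external camera API (usbCameraHelper, connectChecker and textureView are placeholder names):

    // Internal surface: the library hands a SurfaceTexture to start() and reads
    // frames from it, so the external camera must write its frames there.
    VideoSource usbSource = new VideoSource() {
        @Override
        protected boolean create(int width, int height, int fps, int rotation) {
            return true; // nothing extra to set up for this source
        }

        @Override
        public void start(@NonNull SurfaceTexture surfaceTexture) {
            // Render the external USB camera into the library's internal surface
            usbCameraHelper.addSurface(surfaceTexture, false);
        }

        @Override
        public void stop() { }

        @Override
        public void release() { }

        @Override
        public boolean isRunning() { return false; } // state handling is refined below
    };

    RtspServerStream stream = new RtspServerStream(getContext(), 17554, connectChecker, usbSource, new MicrophoneSource());
    // Preview surface: the library only writes the rendered result here.
    stream.startPreview(textureView);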

@dixtdf
Copy link

dixtdf commented Jun 27, 2024

@pedroSG94 Thanks for the reply. I understand the problem now and have managed to get it running, but the picture has black edges, and no matter how I set the resolution the picture is always vertical. I'm trying to figure out how to deal with that!

It's worth noting that isRunning must return false the first time it's called, otherwise the stream won't enter the startup state, i.e. it won't render the video source's SurfaceTexture.

    RtspServerStream rtspServerStream = new RtspServerStream(getContext(), 17554, new ConnectCheckerEvent() {
        @Override
        public void onStreamEvent(StreamEvent streamEvent, String s) {
            Log.d(TAG, s);
        }
    }, new VideoSource() {
        @Override
        protected boolean create(int width, int height, int fps, int rotation) {
            return true;
        }

        @Override
        public void start(@NonNull SurfaceTexture surfaceTexture) {
            // Feed USB camera frames into the library's internal surface
            streamPlayerSurfaceViewUsbCameraHelper.addSurface(surfaceTexture, false);
        }

        @Override
        public void stop() {
        }

        @Override
        public void release() {
        }

        @Override
        public boolean isRunning() {
            return false;
        }
    }, new MicrophoneSource());
    GlStreamInterface glInterface = rtspServerStream.getGlInterface();
    glInterface.setAspectRatioMode(AspectRatioMode.Fill);
    glInterface.setAutoHandleOrientation(false);

    Size previewSize = streamPlayerSurfaceViewUsbCameraHelper.getPreviewSize();
    boolean prepareVideo = rtspServerStream.prepareVideo(1080, 1920, 6000 * 1024, 25, 0, 90);
    boolean prepareAudio = rtspServerStream.prepareAudio(48000, false, 128000);
    rtspServerStream.startStream();

pedroSG94 (Owner) commented:

Try adding this line before startStream:

    rtspServerStream.getGlInterface().forceOrientation(OrientationForced.LANDSCAPE);

If that doesn't solve the problem, share a photo of the result in both cases (with and without the line).

pedroSG94 (Owner) commented Jun 27, 2024

About the running value: I recommend you keep a boolean field and update it in the start/stop methods. Also, if you have a method to remove the surface from streamPlayerSurfaceViewUsbCameraHelper, call it in the stop method to detach the surface.
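A sketch of that suggestion, under the same assumptions as the earlier example; removeSurface is the detach method UVCAndroid's helper exposes (verify the exact name in the library you use):

    new VideoSource() {
        private boolean running = false;
        private SurfaceTexture currentSurface;

        @Override
        protected boolean create(int width, int height, int fps, int rotation) {
            return true;
        }

        @Override
        public void start(@NonNull SurfaceTexture surfaceTexture) {
            currentSurface = surfaceTexture;
            // Attach the internal surface so the helper starts writing frames to it
            streamPlayerSurfaceViewUsbCameraHelper.addSurface(surfaceTexture, false);
            running = true;
        }

        @Override
        public void stop() {
            if (currentSurface != null) {
                // Detach so the helper stops writing to a surface the library released
                streamPlayerSurfaceViewUsbCameraHelper.removeSurface(currentSurface);
                currentSurface = null;
            }
            running = false;
        }

        @Override
        public void release() { }

        @Override
        public boolean isRunning() {
            return running;
        }
    }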


dixtdf commented Jun 27, 2024

@pedroSG94 Thank you for your answers. I've managed to achieve the results I wanted, and your answers have been very helpful. Thanks again!

pedroSG94 (Owner) commented Jun 27, 2024

Great! I'm glad you solved the problem.
If you can share a code example, it will be helpful for others like @milazki.


dixtdf commented Jun 27, 2024

I will; I'll create a sample at a later date.

milazki (Author) commented Jun 27, 2024

@dixtdf Thank you very much; I would be very grateful for an example. The only thing I've managed to do is switch between the connected cameras. Now I'm trying to find a solution so that all cameras are displayed on the screen at the same time.


dixtdf commented Jun 28, 2024

@milazki Sorry, I'm a bit busy with work, so the full example will have to come later. Here is a simple example; maybe you can try it. When a device is plugged in, StateCallback automatically goes to onAttach, and I believe this part of the code can help you. Keep CameraHelper as close to globally unique as possible; frequent camera switching may lead to crashes.

I was debugging forceOrientation and prepareVideo(rotation=90) until really late yesterday, and they feel a bit strange. Maybe the orientation of my USB camera is wrong, but it basically does what I want, so I can't ask for too much!

    usbCameraHelper = new CameraHelper();
    usbCameraHelper.setStateCallback(new ICameraHelper.StateCallback() {
        @Override
        public void onAttach(UsbDevice device) {
            // Request permission as soon as a USB camera is plugged in
            UsbManager usbManager = (UsbManager) getSystemService(Context.USB_SERVICE);
            // On API 31+ the PendingIntent must be mutable so the system can attach the grant result
            int flags = Build.VERSION.SDK_INT >= Build.VERSION_CODES.S ? PendingIntent.FLAG_MUTABLE : 0;
            PendingIntent permissionIntent = PendingIntent.getBroadcast(getContext(), 0, new Intent(ACTION_USB_PERMISSION), flags);
            usbManager.requestPermission(device, permissionIntent);
        }

        @Override
        public void onDeviceOpen(UsbDevice device, boolean isFirstOpen) {
            usbCameraHelper.openCamera();
        }

        @Override
        public void onCameraOpen(UsbDevice device) {
            usbCameraHelper.startPreview();
            // Derive a unique RTSP port per device from the device name suffix
            String deviceName = StringUtils.substring(device.getDeviceName(), StringUtils.lastIndexOf(device.getDeviceName(), "/") + 1);
            RtspServerStream rtspServerStream = new RtspServerStream(getContext(), 10554 + Integer.valueOf(deviceName), new ConnectCheckerEvent() {
                @Override
                public void onStreamEvent(StreamEvent streamEvent, String s) {
                    Log.d(TAG, s);
                }
            }, new VideoSource() {
                @Override
                protected boolean create(int width, int height, int fps, int rotation) {
                    return true;
                }

                @Override
                public void start(@NonNull SurfaceTexture surfaceTexture) {
                    // Feed this camera's frames into the stream's internal surface
                    usbCameraHelper.addSurface(surfaceTexture, false);
                }

                @Override
                public void stop() {
                }

                @Override
                public void release() {
                }

                @Override
                public boolean isRunning() {
                    return false;
                }
            }, new MicrophoneSource());
            GlStreamInterface glInterface = rtspServerStream.getGlInterface();
            glInterface.setAspectRatioMode(AspectRatioMode.Adjust);
            glInterface.forceOrientation(OrientationForced.NONE);

            Size previewSize = usbCameraHelper.getPreviewSize();
            boolean prepareVideo = rtspServerStream.prepareVideo(1920, 1080, 6000 * 1024, 25, 0, 90);
            boolean prepareAudio = rtspServerStream.prepareAudio(48000, false, 128000);
            if (prepareVideo && prepareAudio) {
                rtspServerStream.startStream();
            }
        }
    });

milazki (Author) commented Jun 28, 2024

@dixtdf Thanks for the hint, but unfortunately I didn't succeed. I'd be glad to see an example of how you manage to solve this problem.


dixtdf commented Jun 29, 2024

My approach for multiple USB playback is to automatically create an RTSP service when a USB camera is inserted, and then play the video directly from its RTSP address. This avoids frequent switching of the camera, although it may consume more memory. The VideoSource is the most important part of the example I provided: once you learn how to write data to the VideoSource's SurfaceTexture, you will have succeeded. My suggestion is to avoid CameraHelper at first: try rendering the USB video to a regular view, and then try creating an RTSP service once that works.
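To connect this back to the original multi-camera question: the example above generalizes to one RtspServerStream per attached camera, each on its own port. A hedged sketch of the per-device bookkeeping, using the same assumed APIs as the earlier examples (buildStreamFor is a hypothetical helper wrapping the stream setup shown in onCameraOpen above, and onDetach is assumed to be UVCAndroid's unplug callback; verify both against the library):

    // One RTSP server per attached USB camera; each camera is then played
    // from its own rtsp://<device-ip>:<port>/ address.
    private final Map<UsbDevice, RtspServerStream> streams = new HashMap<>();

    private int portFor(UsbDevice device) {
        // Same per-device port scheme as above, e.g. /dev/bus/usb/001/002 -> 10556
        String name = device.getDeviceName();
        return 10554 + Integer.parseInt(name.substring(name.lastIndexOf('/') + 1));
    }

    @Override
    public void onCameraOpen(UsbDevice device) {
        // buildStreamFor is a hypothetical helper: it builds, prepares and starts
        // the RtspServerStream exactly as shown in the example above
        RtspServerStream stream = buildStreamFor(device, portFor(device));
        streams.put(device, stream);
    }

    @Override
    public void onDetach(UsbDevice device) {
        // Stop and forget the stream when its camera is unplugged
        RtspServerStream stream = streams.remove(device);
        if (stream != null) stream.stopStream();
    }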
