Search before asking
I have searched the Inference issues and found no similar feature requests.
Question
Hello everyone,
I'm very interested in Roboflow Inference. I'm wondering: with the current InferencePipeline, is it possible to add, edit, or remove inputs on a single running pipeline?
For example, suppose the AI model has a max_batch_size of 8, and initially I initialize the pipeline with only one RTSP stream:
pipeline = InferencePipeline(...)  # single RTSP stream
pipeline.start()
Can I then add another RTSP stream to the pipeline while it is running? For example:
pipeline.add_stream("rtsp://...")
Would adding a stream leave the previously running threads unaffected?
Can anyone tell me whether InferencePipeline currently supports this?
Also, in your opinion, is calling add_stream on a running pipeline better than running multiple copies of the pipeline at the same time?
Thanks!
Additional
No response
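For context on the asker's last question: the route that works today is running one independent pipeline per source. Below is a stdlib-only sketch of that "multiple copies" pattern; run_pipeline and the source URLs are stand-ins, not the real InferencePipeline API (the commented lines show roughly what the real calls would look like). Each copy is isolated, but the model is loaded once per copy and frames are never batched across sources.

```python
import threading

def run_pipeline(source, results):
    """Stand-in for one InferencePipeline copy bound to a single source."""
    # With the real library this worker would do, roughly:
    #   pipeline = InferencePipeline.init(video_reference=source, ...)
    #   pipeline.start()
    results[source] = f"processed {source}"

sources = ["rtsp://cam-1", "rtsp://cam-2", "rtsp://cam-3"]
results = {}
threads = [threading.Thread(target=run_pipeline, args=(s, results)) for s in sources]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(results))  # 3 — each copy ran independently of the others
```

A crashed or reconnecting copy never touches the other sources, which is the main upside of this approach over a single shared pipeline.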
Hello there,
That's an interesting feature, but it is not supported at the moment.
Within the current structure of the code, the feature should be possible to add, since we already manage streams dynamically on separate threads.
We will add it to our backlog, but we are handling a lot of other work right now. It could ship faster if you decided to contribute, but that's entirely up to you.
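Since streams already run on separate threads, here is a minimal stdlib-only sketch of what attaching and detaching sources on a running pipeline could look like. This is a toy illustration of the threading pattern, not the real InferencePipeline internals: StreamManager, add_stream, remove_stream, and the frame-source callables are all hypothetical.

```python
import threading
import queue
import time

class StreamManager:
    """Toy sketch: one reader thread per stream feeding a shared frame queue.

    Illustrates add/remove on a running set of stream threads; it is NOT
    the real InferencePipeline API.
    """

    def __init__(self):
        self._frames = queue.Queue()
        self._threads = {}   # stream_id -> (thread, stop_event)
        self._lock = threading.Lock()

    def add_stream(self, stream_id, read_frame):
        """Start a new reader thread; already-running streams are untouched."""
        stop = threading.Event()

        def loop():
            while not stop.is_set():
                frame = read_frame()
                if frame is None:          # source exhausted
                    break
                self._frames.put((stream_id, frame))
                time.sleep(0.001)          # simulate frame pacing

        t = threading.Thread(target=loop, daemon=True)
        with self._lock:
            self._threads[stream_id] = (t, stop)
        t.start()

    def remove_stream(self, stream_id):
        """Signal exactly one reader to stop, leaving the others running."""
        with self._lock:
            t, stop = self._threads.pop(stream_id)
        stop.set()
        t.join()

    def drain(self):
        """Collect everything produced so far."""
        out = []
        while True:
            try:
                out.append(self._frames.get_nowait())
            except queue.Empty:
                return out

# Usage: two fake frame sources stand in for RTSP streams.
manager = StreamManager()
manager.add_stream("cam-a", lambda: "frame-a")
manager.add_stream("cam-b", lambda: "frame-b")
time.sleep(0.05)
manager.remove_stream("cam-b")   # cam-a keeps producing, unaffected
time.sleep(0.05)
frames = manager.drain()
print(sorted({sid for sid, _ in frames}))
```

The key property the question asks about falls out naturally here: because each source owns its own thread and stop event, adding or removing one stream never touches the threads of the others.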