I usually stream, and I use a capture card to do it. But when the signal goes through the capture card it loses resolution and frames. So I have two options and need advice on either of them:
1. Use an HDMI splitter. Do I need a 4K HDMI 2.1-specific splitter, or can I use any HDMI splitter? Do the other cables need to be HDMI 2.1 too for me not to lose quality or performance?
2. Output the image directly from the Spectrum to my laptop as a video capture input (without a capture card). Is there a way to do this?
Generally, the quality of the video signal coming out of the capture card depends a lot on the card itself and on the specs of the cables. If those check out, then all that's left is to find the right settings.
My advice on both options:
This goes back to my point above. I assume that if you use a splitter, you will use the Spectrum as your gaming screen and have an alternative screen hooked up to your streaming PC / laptop?
Keep in mind as well that some splitters act as a “mirror”, meaning they feed both devices the same signal, capped at the highest resolution that both of them support. Better splitters can keep the highest possible resolution for each output even if they are different (see the sketch at the end of this post).
As far as I know, this is not possible, since the Spectrum lacks video output capability (except for Model 1, which is specifically for daisy-chaining).
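To make the “mirror” point concrete, here is a minimal Python sketch of the two splitter behaviours. The mode lists are made-up examples for illustration, not real EDID data from these devices:

```python
# Toy illustration (not real splitter firmware): why a "mirror" splitter
# limits both outputs, while a per-output splitter does not.
# Modes are (width, height, refresh_hz); the lists below are assumptions.

XBOX_OUTPUT_MODES = [(3840, 2160, 120), (3840, 2160, 60), (1920, 1080, 120), (1920, 1080, 60)]
SPECTRUM_MODES    = [(3840, 2160, 120), (3840, 2160, 60), (1920, 1080, 60)]
CAPTURE_MODES     = [(1920, 1080, 60)]   # e.g. a 1080p60-class capture input

def best(modes):
    # Rank by resolution first, then refresh rate.
    return max(modes, key=lambda m: (m[0] * m[1], m[2]))

# "Mirror" splitter: both outputs carry one signal, so it must pick a mode
# every connected device supports -- the lowest common denominator.
common = set(XBOX_OUTPUT_MODES) & set(SPECTRUM_MODES) & set(CAPTURE_MODES)
print("mirror splitter feeds both:", best(common))            # (1920, 1080, 60)

# Better splitter: each output negotiates independently (the splitter
# scales/converts), so the monitor keeps its full mode.
print("per-output splitter, monitor:", best(set(XBOX_OUTPUT_MODES) & set(SPECTRUM_MODES)))
print("per-output splitter, capture:", best(set(XBOX_OUTPUT_MODES) & set(CAPTURE_MODES)))
```

In the mirror case, the capture device drags the monitor down with it; in the per-output case, the monitor keeps 4K 120 Hz while the capture side gets what it can handle.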
Thank you for your answer @Cas. So my current setup is:
Xbox Series X connected via HDMI 2.1 to:
— Elgato HD60 S capture card, connected to both:
—— Eve Spectrum via HDMI (where I play)
—— HP laptop (where I capture and stream via OBS Studio)
Does that make sense?
Of course, when I connect it that way I lose the option to play at either 120 fps or 4K. So what I conclude from your message is that I should buy a 4K+ splitter and at least one more HDMI 2.1 cable, right?
Nice! I really appreciate it. Just one doubt: why is it that I can’t use an HDMI 2.1 cable for the Spectrum and an HDMI 2.0 cable for the Elgato? Is it because of the mirror effect you mentioned?
Nope. The mirror effect comes from using a cheaper HDMI splitter.
For your proposed setup, it varies from case to case.
Usually this advice is just to prevent possible problems, as using different cables may result in different signals being transmitted.
Not to mention that cables from different brands vary in quality, etc. Having all the cables under at least the same standard (i.e. HDMI 2.1) minimises the possibility of error. Cables are a complicated business these days (and don’t get me started on USB-C).
But of course, if you have a spare cable lying around, you can try using a 2.0 cable for the Elgato and see if it works for you.
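If you want a rough feel for the numbers behind this, here is a back-of-the-envelope Python sketch. It counts raw uncompressed 8-bit RGB bits and ignores blanking and encoding overhead, so real requirements run somewhat higher, but the conclusions hold:

```python
# Back-of-the-envelope bandwidth check: raw uncompressed 8-bit RGB,
# ignoring blanking/encoding overhead (real requirements are higher).

HDMI_2_0_GBPS = 18.0   # HDMI 2.0 maximum signal rate
HDMI_2_1_GBPS = 48.0   # HDMI 2.1 maximum signal rate (FRL)

def gbps(width, height, hz, bits_per_pixel=24):
    """Approximate data rate of a video mode in gigabits per second."""
    return width * height * hz * bits_per_pixel / 1e9

for name, mode in [("1080p60", (1920, 1080, 60)),
                   ("4K60",    (3840, 2160, 60)),
                   ("4K120",   (3840, 2160, 120))]:
    need = gbps(*mode)
    verdict = ("fits HDMI 2.0" if need < HDMI_2_0_GBPS
               else "needs HDMI 2.1" if need < HDMI_2_1_GBPS
               else "exceeds HDMI 2.1")
    print(f"{name}: ~{need:.1f} Gbps -> {verdict}")
```

So 4K 120 Hz is the leg that genuinely needs a 2.1-rated cable, while anything carrying only 1080p60 sits comfortably inside HDMI 2.0's limit.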
If I have a friend whose monitor is just 1080p and 60 fps with an Xbox Series X… is it the same for him to use an HDMI 2.0 cable, or would I be taking advantage of him if he gave me the cable included with his Xbox Series X (HDMI 2.1)?