
Our Studio was a huge waste of money - GameLinked Launch

AdamFromLTT

Making a gaming news channel is easy. The hard part is finding the space to shoot it. Despite buying a separate building for our Lab, we have precious little room for more sets. We've got Short Circuit, Mac Address, TechLinked, Techquickie, and Linus Tech Tips all competing for space; it's getting cramped in here. But with the magic of green screen we can transport ourselves anywhere, and with the Blackmagic Ultimatte 12 we can do it live instead of in post.

Subscribe to GameLinked: https://www.youtube.com/watch?v=EzO6sC_ukqk

 


At this point LTT has made so many other channels that I think they should make a whole other platform for them (YouTube 2, anyone?)

Message me on discord (bread8669) for more help 

 

Current parts list

CPU: R5 5600 CPU Cooler: Stock

Mobo: ASRock B550M-ITX/ac

RAM: Vengeance LPX 2x8GB 3200MHz CL16

SSD: P5 Plus 500GB Secondary SSD: Kingston A400 960GB

GPU: MSI RTX 3060 Gaming X

Fans: 1x Noctua NF-P12 redux, 1x Arctic P12, 1x Corsair LL120

PSU: NZXT SP-650M SFX-L PSU from H1

Monitor: Samsung 34-inch WQHD and 43-inch TV

Mouse: Logitech G203

Keyboard: Rii membrane keyboard




Damn this space can fit a 4090 (just kidding)


1 minute ago, LMGcommunity said:

May I introduce you to Floatplane?

May I introduce you to our sponsor! Glasswire!



Just now, filpo said:

May I introduce you to our sponsor! Glasswire!

and Ridge Wallet! Do you put your wallet in your #@?!§%& pocket? Then this #@?!§%& wallet is for you! Because why carry a comfortable leather, nylon, or silk wallet when you can have a hard #@?!§%& metal #@?!§%& chunk in your pocket, sold with technobabble from this ad. So please click, so we can keep plugging lttstore dot com and other #@?!§%& through the show, as well as the #@?!§%& you have no interest in. And neither does anyone sane.


and our other sponsor, #@?!§%& VPN! Why use a real VPN that might actually work for you, for about the price of a 12-pack of soda, when you can use a 'free' (only during a janky trial) VPN like #@?!§%& VPN.


The green screen and tracking technology is pretty cool, and it's neat that it's become relatively attainable. At the same time, it's pretty obvious it's a green screen, both here and in the GameLinked video. I understand a media company will always weigh the price against the performance, which includes the workflow and time spent, but a real-life set is clearly still superior. It'll be cool to see where LTT can take it once they get comfortable with the technology and workflow, though.


If this new channel is something you wanted to see from LMG, may I humbly suggest Skill Up's "This Week in Videogames". He always makes his show with the perfect amount of humour, in-jokes, and relevant gaming and tech-related news.

I'm interested to see how these two shows will differ 🙂


I actually have some experience with the kind of professional broadcast virtual set solutions Linus touched on.

 

Camera tracking can be done with a rig like the Mo-Sys StarTracker. Instead of QR code plates (which could be seen in AR applications), that system uses a constellation of retroreflective stickers placed in an irregular pattern at different heights all over the ceiling (for example, on a lighting grid, ductwork, and the physical ceiling of a studio space). A camera module positioned above the camera's sensor flashes pulsed IR at the ceiling and monitors the constellation overhead. By plotting the stickers it sees against a known map, and watching the parallax scrolling of the different layers, it can work out where the camera is in physical space and what angle it's pointed at. A serial cable between the lens grip and processor box adds your lens data (zoom, focus, iris, etc.), then all those coordinates are shoved down the network to the character generator (graphics PC).
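For the curious, the math underneath a system like that is essentially a perspective-n-point solve: given a known 3D map of the stickers and where they land in the tracking camera's image, you can recover the camera's pose. A minimal sketch in Python with OpenCV (the marker coordinates and intrinsics below are invented for illustration; the real product obviously adds filtering, sensor fusion, and a lot more):

```python
import numpy as np
import cv2

# Known 3D positions of the ceiling stickers (the "star map"), in meters.
# These coordinates are made up for this example.
star_map = np.array([
    [0.0, 0.0, 3.2], [1.1, 0.4, 3.5], [2.3, -0.6, 3.2],
    [0.7, 1.8, 3.8], [-1.2, 0.9, 3.4], [1.9, 2.2, 3.6],
], dtype=np.float64)

# Where those stickers appear in the tracking camera's image (pixels).
image_points = np.array([
    [612.0, 344.0], [880.5, 401.2], [1103.8, 295.4],
    [701.3, 590.8], [350.2, 512.6], [1010.7, 655.1],
], dtype=np.float64)

# Intrinsics of the upward-facing tracking camera (fx, fy, cx, cy).
K = np.array([[1400.0, 0.0, 960.0],
              [0.0, 1400.0, 540.0],
              [0.0, 0.0, 1.0]])

# Solve for the camera's rotation and translation relative to the map.
ok, rvec, tvec = cv2.solvePnP(star_map, image_points, K, None)
R, _ = cv2.Rodrigues(rvec)

# Camera position in world space: invert the world-to-camera transform.
camera_position = -R.T @ tvec
print("camera at", camera_position.ravel())
```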

 

A graphics system like Ross Voyager or Viz Virtual Studio uses the coordinates Mo-Sys is vomiting at it to place a virtual camera in the virtual set. If you're "just" doing AR, it will only render the elements that have to be overlaid onto the real video. I think it renders two keys if the presenter is on a green screen and has to be sandwiched in between graphic elements "in front of" and "behind" them. The fill (which you'll see) and key (black-and-white knockout) get passed along to a video switcher, which keys the layers in as necessary. 
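In case it's unclear what the switcher actually does with those two signals: keying is just a per-pixel linear mix. A toy numpy version, assuming an unshaped (non-premultiplied) fill; this shows the general technique, not any particular vendor's implementation:

```python
import numpy as np

def key_layer(background, fill, key):
    """Mix a fill over the background wherever the key is white.

    background, fill: float images in [0, 1], shape (H, W, 3)
    key: float matte in [0, 1], shape (H, W, 1); 1.0 = fill fully visible
    """
    return fill * key + background * (1.0 - key)

# Sandwiching the keyed host between "behind" and "in front of" graphics
# is just two passes:
#   comp = key_layer(virtual_set, host_fill, host_key)
#   comp = key_layer(comp, front_gfx_fill, front_gfx_key)
```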

 

All that gets done in real time at 59.94 fps. Then it's down the air path to get closed captioning and Nielsen codes added, be encoded and compressed six ways from Sunday, maybe have a little PSIP data for channel guides tacked on, then beamed over the air and/or down a wire or fiber to your TV, where it gets turned into photons that get blasted straight into your eyeballs.

 

It's amazing how capable Blackmagic/Behringer-class video equipment has gotten, and at very approachable price points. The professional stuff is still very expensive and kind of finicky to set up, but when the stars align and it all works right...

 


 

Of course, the real big brain TV production move is to build a huge, seamless LED wall for normal graphics and video, then fill it with green when your normal green screen isn't big enough for a special production.

 

42 minutes ago, XNOR said:

The green screen and tracking technology is pretty cool, and it's neat that it's become relatively attainable. At the same time, it's pretty obvious it's a green screen, both here and in the GameLinked video. I understand a media company will always weigh the price against the performance, which includes the workflow and time spent, but a real-life set is clearly still superior. It'll be cool to see where LTT can take it once they get comfortable with the technology and workflow, though.

You have to get your lighting just right to make a natural-looking key on a virtual set. That can be trickier than it sounds. Not only do you have your normal green screen lighting issues to contend with (staying far enough away not to get a green glow from light bouncing off the wall, appropriate key lighting on your host, lighting them from above and behind...), but the physical set lighting has to jibe with the virtual lighting in the virtual set.

 

It's easy to goof that part up and end up with a shot that looks jarring and unnatural. (And in my opinion, there's kind of an uncanny valley effect to lighting, where the closer to perfect you get the more the flaws stand out.)
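Incidentally, one classic trick for the green bounce mentioned above is spill suppression: clamp the green channel so it can't exceed the larger of red and blue. A rough numpy sketch of the idea; real keyers like the Ultimatte are far more sophisticated about it:

```python
import numpy as np

def suppress_green_spill(rgb):
    """Clamp green to max(red, blue) per pixel to neutralize green bounce.

    rgb: float image in [0, 1], shape (H, W, 3).
    """
    out = rgb.copy()
    out[..., 1] = np.minimum(rgb[..., 1],
                             np.maximum(rgb[..., 0], rgb[..., 2]))
    return out
```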

I sold my soul for ProSupport.


Riley: meticulously waiting to work out all these details...

Editor: Today is June 28!


27 minutes ago, XNOR said:

The green screen and tracking technology is pretty cool, and it's neat that it's become relatively attainable. At the same time, it's pretty obvious it's a green screen, both here and in the GameLinked video. I understand a media company will always weigh the price against the performance, which includes the workflow and time spent, but a real-life set is clearly still superior. It'll be cool to see where LTT can take it once they get comfortable with the technology and workflow, though.

We're still working on ironing out the kinks to make it more convincing. With any new hardware and production process, there is a lot to learn.

Using the green screen for TechLinked works great! But we've noticed artifacts where the machine struggles to key out the blur that's naturally captured by the camera. We're only gonna get better at it!


Congrats on the launch. 

If you are going to make Talklinked-ish videos, it could be cool if you invited external journalists to discuss specific topics.

Like Yahtzee Croshaw, for example. Dunno if it's possible, but it could be cool.

mITX is awesome! I regret nothing (apart from when picking parts or having to do maintenance *cough*cough*)


21 minutes ago, AdamFromLTT said:

Using the green screen for TechLinked works great! But we've noticed artifacts where the machine struggles to key out the blur that's naturally captured by the camera. We're only gonna get better at it!

Is the virtual TechLinked set a 3D recreation, or is it a photo of the physical set? (I couldn't tell, since the camera is stationary for TechLinked.)

 

I think it would be worthwhile to make a full virtual TechLinked set using all the tech GameLinked will use, because that would give you a known end result to shoot for. Then you can apply that experience to lighting for the GameLinked virtual set.

 

Virtual WAN Show when? 

I sold my soul for ProSupport.


Okay, first off: that full Vive Mars tracking kit is awesome, but it's overkill. It's like buying the whole 32-camera OptiTrack system to capture a single actor's mocap when you could do it with just the 8-camera setup and end up with the same results.

 

What you need is:
A Vive/Vive Pro/whatever headset that comes with the OpenVR Lighthouse set and can support Vive trackers and OpenVR (SteamVR)

1 Vive tracker (any generation) with its dongle, or a 3rd controller plus a Vive tracker dongle (to connect the 3rd controller to the PC; out of the box you can only connect 2 controllers to one headset at a time)

 

The Vive tracker is the easier solution because it has a standard 1/4" thread in the bottom, so you can get a steel angle bracket or something sturdy to attach the tracker to your camera rig. With a 3rd controller, well, duct tape is silvery.

 

I'm not sure whether this works with Unreal 5, but the Unity branch of OpenVR is more than happy to provide an automatic "quartered views" mode when it notices 3 controllers (if you went with the Vive tracker, happy days: you get to say goodbye to the warranty and flash the tracker with controller firmware). Now this is where all the magic happens; you get 4 views which you want to arrange in OBS:

1. Headset View (the normal screen output from the left eye of the headset)

2. Background (everything inside the "game" BEHIND the headset from the camera)

3. Foreground (everything between the headset and the camera)

4. Foreground alpha.

They will look like crap because you haven't set up the SteamVR configuration yet. This part is a bit tedious. First, you need to manually enter the offset of the camera rig (as in, of the 3rd controller/tracker; keep track of which one that is, because it needs to be the last one connected to SteamVR each time). This offset is basically the position and rotation difference between the physical controller/tracker and the actual camera: the default position is as if the camera were looking through the hole in the Vive controller, and you want it to match however your controller/tracker is actually mounted on your camera. Second, you set the FoV and distortion of the in-game camera lens to match the actual camera lens; this way you get 1:1 movement and scale.
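For reference, in SteamVR's mixed reality mode those values traditionally live in an externalcamera.cfg file next to the game's executable (at least for Unity titles that support it): x/y/z are the positional offset in meters, rx/ry/rz the rotation in degrees, and fov the camera's vertical field of view. A bare-bones example, with placeholder numbers you'd replace with your measured offsets:

```
x=0.0
y=0.0
z=0.0
rx=0.0
ry=0.0
rz=0.0
fov=60.0
near=0.01
far=100.0
```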

 

Now you just need a green screen, a green cube, or whatever; put the headset on the person and start shooting like normal, except you arrange the screens from the quartered views correctly in OBS, and out comes real-time MR footage. You can also do all the fancy stuff like routing the quartered-view footage to the Blackmagic black box, or throwing the green screen out of the window and having stuff fly around the studio by just taking the foreground footage and culling it with the foreground alpha... With some development you can do even more within the game engine, as long as it allows multiple image outputs, so you can build the quartered views yourself or do something completely different (like culling the background from the quartered footage and sending the now-stripped in-game footage to a phone or whatever screen on the camera rig, so the camera operator can see what they are filming within the game engine).
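If you end up wiring this together yourself, the quartered output is just one big frame you crop into four. Something like the sketch below works; which quadrant holds which view has varied between versions, so treat the layout here as an assumption and relabel to match what you actually see:

```python
def split_quartered_view(frame):
    """Crop a SteamVR 'quartered view' frame into its four views.

    frame: image array of shape (H, W, channels).
    The quadrant assignments below are assumptions; verify against
    your own output and swap the names as needed.
    """
    h, w = frame.shape[0] // 2, frame.shape[1] // 2
    return {
        "foreground":       frame[:h, :w],  # assumed top-left
        "foreground_alpha": frame[:h, w:],  # assumed top-right
        "background":       frame[h:, :w],  # assumed bottom-left
        "headset":          frame[h:, w:],  # assumed bottom-right
    }
```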

 

Of course, with internal development (or at least knowing Unity) you can ditch all that, get some ping-pong balls and an OptiTrack system, and basically play at being Neill Blomkamp making the Oats Studios stuff: multiple people digitally tracked, a freely moving "digital" camera tracked, random objects tracked, and all of it so real-time that the "digital" camera operator can hold a tablet instead of a camera and see the in-engine footage they are shooting.

And yeah, Oats Studios' Adam was shot with Unity, OptiTrack, and facial tracking, with the difference that they used stand-in props and graphics within the engine during filming, recorded all the movement data, and rendered the final footage from it later. And that was 5-6 years ago.


Best thing to do: don't let Colton near it!!!

MSI X399 SLI Plus | AMD Threadripper 2990WX, all-core 3GHz lock | Thermaltake Floe Riing 360 | EVGA 2080, Zotac 2080 | G.Skill Ripjaws 128GB 3000MHz | Corsair RM1200i | 150TB | ASUS TUF Gaming mid tower | 10Gb NIC


I can't wait to see what the community does with all these clips of Linus on a green screen 🤣🤣


I'm currently studying film at a small Aussie uni and looking into using that iPhone tracking solution to make post VFX way, way easier. Do you have any info on exactly how you implemented it for post-production? I've always been interested in virtual production and post VFX, but didn't realise how accessible it was with just a main cam and an iPhone. I assume I have to measure the distance between the iPhone and the sensor to get the right offset, but what app should I use for the movement tracking?


20 hours ago, Needfuldoer said:

You have to get your lighting just right to make a natural-looking key on a virtual set. That can be trickier than it sounds. Not only do you have your normal green screen lighting issues to contend with (staying far enough away not to get a green glow from light bouncing off the wall, appropriate key lighting on your host, lighting them from above and behind...), but the physical set lighting has to jibe with the virtual lighting in the virtual set.

 

It's easy to goof that part up and end up with a shot that looks jarring and unnatural. (And in my opinion, there's kind of an uncanny valley effect to lighting, where the closer to perfect you get the more the flaws stand out.)

That's definitely true, and uncanny valley was pretty much the term I was thinking of too. The tracking also feels subtly off. I'm not sure whether the update frequency of the tracking is slightly different from the real footage, or perhaps the tracking isn't quite accurate enough, but the real footage and the virtual studio feel somewhat disjointed pretty much right from the start.


18 hours ago, ToboRobot said:

How long before Linus has a studio at home?

Considering how many videos are shot at his place, I'd say he already does.


14 minutes ago, XNOR said:

Considering how many videos are shot at his place, I'd say he already does.

Sorry, I meant with the green screen, so that he doesn't actually need to go into "the office" to shoot and can work from home more.


This was fun for me to see. I didn't know Ultimatte was still a thing; back in the '00s when I worked in Los Angeles, it was the dominant keyer. (Thankfully I work in VFX and stay well away from anything to do with pulling keys myself.)

I wonder how long it will be before AI-based keying tools supplant it, though.

