I'm not sure if they've mapped every multitouch gesture to the Vision Pro out of the box, but it's something they can and should do in time. There's a lot of potential there.
You could easily have some of the same gestures do double duty as remote inputs on TV interfaces, since it's all context-dependent on where your eyes are, and there aren't that many to map. But swiping up/down/left/right to navigate a TV interface would get old, I think.
I do actually think they should (I understand the developer relations/contract reasons they don't) straight up give you emulators that apps can't distinguish from a real TV/iPad/iPhone on both macOS and Vision Pro, and take action against developers who try to artificially block you from using their apps on other devices. Some things won't work, but most will, and letting developers artificially segment the market when it's all basically the same chip now is kind of bullshit.
There's a lot of research demonstrating that external factors have a pretty major impact on criminal behavior (nutrition, socialization, etc during developmental years, as examples). So society plays a role.
If you're interested in reading more, Robert Sapolsky's Behave is pretty long and a little heavy, but a great, reasonably broad view of the things that make us tick through a bunch of different lenses. It's tied-ish with Daniel Kahneman's Thinking, Fast and Slow as my favorite non-fiction, and it looks more at social factors like the examples above. I haven't read Determined yet, and I really doubt it's going to convince me not to believe in free will, but his underlying base of knowledge is legit.