BlockifyVR (Virtual Reality) #1732
Conversation
that looks cool

sadly I don't think it's possible to stop this.
Does this show the same Scratch 3D perspective to both eyes?
A-frame has stereoscopic rendering built in and I've enabled it in the rendering settings, so no. This fixes some issues with visualizing depth: the eye offset allows your brain to process the image with more clarity. Even if the matrices aren't used in the renderer, it should still apply the correct eye offset. Why do you ask, just out of curiosity? If you'd like, I could add a block that enables or disables this feature, in case a Scratcher wants to build that framework themselves, which I doubt anyone would. Keep in mind this is still a work in progress, and most of the code is many months old, so it's not the most efficient way to do VR.
Yes. At one point I tried to extend my AR extension to support AR headsets and VR as well, but multi-eye rendering has been the main thing holding me back. I was just curious how you solved it.
I believe the XRViewerPose can expose separate matrices for each eye when available, according to the Mozilla docs. You might want to look into that again. It'd be great if Augmented Reality could support AR headsets such as the Oculus Quest 3 in addition to just mobile touchscreen devices. There's a views array that can do this. An example can be seen here. Just a suggestion though; I simply think AR headset support would be a nice update.
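For reference, the per-eye data lives on the `views` array of the `XRViewerPose` returned by `XRFrame.getViewerPose()` in standard WebXR. The object below is a plain mock with the same shape as a stereo headset's pose, so the iteration pattern is visible outside a browser:

```javascript
// In a real XR session the render loop looks like:
//   const pose = frame.getViewerPose(refSpace);
//   for (const view of pose.views) { /* one draw per eye */ }
//
// Mocked pose (not real API objects) with the shape of a stereo XRViewerPose:
const pose = {
  views: [
    { eye: "left",  projectionMatrix: new Float32Array(16) },
    { eye: "right", projectionMatrix: new Float32Array(16) },
  ],
};

const drawnEyes = [];
for (const view of pose.views) {
  // Each XRView carries its own projectionMatrix and transform,
  // so the scene gets rendered once per entry.
  drawnEyes.push(view.eye);
}
console.log(drawnEyes); // [ 'left', 'right' ]
```

On a stereo headset `views` has two entries; on a mobile AR session it typically has one, which is why iterating the array handles both cases with the same code.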
I'll take a look when I have time, though it will probably be hard because the only kind of VR I have access to is phone-based 3DOF no-controller VR. Also, do you mind giving some links so that I can figure out where to look faster?
As for VR testing, you wouldn't really need a VR headset. I've used Meta Quest's Immersive Web Emulator extension for testing when I didn't care to load up the entire headset.
ah yes, I remember seeing a trailer for this on scratch. I wanted to help, but now I can't

nvm I think I can

I took a look and have no idea. It's all code specific to A-Frame, which I don't know. A-Frame is too big and complicated for me to try to debug, especially if the only tool at my disposal is the WebXR API Emulator browser extension.
Nic3

Nice*

then how to use it???

Hey, have u tried it on a VR
Thebloxers998 left a comment
If it's not buggy anymore, then you should try to get it into TurboWarp. Hopefully it gets added.
I've determined that this is incorrect and outdated information from a previous period of ignorance about how A-frame works.

@Xeltalliv, the answer is yesn't. Yes and no. It's fairly complicated: A-frame has no way of knowing what kind of 3D engine the Scratch project uses, so there's no true stereoscopy; the Scratch stage only renders one view. However, the entire Scratch stage is mirrored into A-frame in real time, and the A-frame renderer uses stereoscopy. This makes the flat texture on the flat plane give a very convincing illusion of depth, but it's "fake" stereoscopy.

I was considering extending the existing blocks to enable/disable A-frame's stereoscopy settings and return the projection/transformation matrices for the left and right eyes instead of the generalized camera, but I quickly realized it would be unfeasible given the ambiguity of the situation: how would it handle headsets with one display per eye (as with some modern headsets) rather than the traditional one-display, two-lenses approach? How would A-frame use each project's unique stereoscopy implementation to render the correct view? Would the display always be split down the middle, or would there be an alternate implementation? Simply plugging the stereoscopy directly into the project and having the project render two views could result in each eye seeing two views, which clearly wouldn't work, so I decided against it.

Either way, I'm convinced most users will find the built-in stereoscopy solution compatible with their projects, easy to use, and convincing enough for most scenarios. While it doesn't handle complex, realistic depth scenarios (like putting your index finger up close to your nose and focusing on a faraway object so that your finger appears doubled), in my testing everything appears as expected and there's nothing jarring that would throw users off.
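The per-eye matrices discussed above boil down to offsetting the head pose by half the interpupillary distance (IPD) toward each eye. A minimal sketch under assumed conventions (column-major 4x4 matrices, offset along the matrix's local X axis); this is illustrative only, not BlockifyVR code:

```javascript
// Offset a head-pose matrix by ±ipd/2 along its local X axis to get
// a per-eye matrix. `headView` is a flat, column-major 4x4 array.
function eyeViewMatrix(headView, ipd, eye) {
  const sign = eye === "left" ? -1 : 1;
  const out = headView.slice();
  // The local X axis is the matrix's first column (indices 0..2);
  // scale it by ±ipd/2 and add it to the translation column (12..14).
  out[12] += headView[0] * sign * (ipd / 2);
  out[13] += headView[1] * sign * (ipd / 2);
  out[14] += headView[2] * sign * (ipd / 2);
  return out;
}

// Identity head pose, 64 mm IPD:
const head = [1, 0, 0, 0,  0, 1, 0, 0,  0, 0, 1, 0,  0, 0, 0, 1];
const left = eyeViewMatrix(head, 0.064, "left");
console.log(left[12]); // -0.032
```

The ambiguity mentioned above is real: this simple offset only makes sense if the project's renderer agrees on the handedness, units, and matrix layout, which a generic extension cannot know in advance.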
Progress update and a request

I've been working on various optimizations and bug fixes throughout the extension, along with preparing it for the final release. Over the past few months I've rearranged the extension structure to more closely follow A-frame's best practices, allowing me to fix a breaking issue and update past version 1.5.0; it's now on the latest version, which improves my workflow. Additionally, Meta/Oculus seems to have fixed an issue in their browser where the controllers wouldn't connect immediately upon entering VR, but only after pressing the system button. I don't know how other browsers on other platforms behave, but overall this is an improvement.

Yesterday I was working on a change to the design philosophy of display scaling. I added a "resolution scale" block (it only accepts values from 0.1 to 2.0) and refactored the component to set the size of the Scratch stage to the size of the renderer times the resolution scale, instead of dynamically deciding what to do based on whatever size is already present. This makes the scaling much more predictable, easier to understand, and compatible with Augmented Reality, and it behaves more as users might expect; the resolution scale block lets users control the balance of quality and performance.

@Xeltalliv and @CubesterYT: there are two tiny issues that need addressing:
and wrapping the entire texture logic in a
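The scaling rule described above (stage size = renderer size × clamped resolution scale) can be sketched as follows; the function name and rounding are assumptions for illustration, not the actual component code:

```javascript
// Compute the Scratch stage size from the XR renderer size and the
// user-chosen resolution scale, clamped to the block's 0.1–2.0 range.
function stageSize(rendererWidth, rendererHeight, resolutionScale) {
  const scale = Math.min(2.0, Math.max(0.1, resolutionScale));
  return {
    width: Math.round(rendererWidth * scale),
    height: Math.round(rendererHeight * scale),
  };
}

console.log(stageSize(1920, 1080, 0.5)); // { width: 960, height: 540 }
```

Deriving the stage size from the renderer size (rather than inspecting whatever size is already present) is what makes the behavior predictable across VR and AR sessions.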
Sorry for writing an essay
I've narrowed down the issue: a previous commit in my private repo was working, and by cross-referencing with current versions I was able to get the display to appear immediately. I'm still working on understanding why it happens so I can prevent it in the future. Additionally, after I did that, the controllers didn't connect until after pressing the Oculus button, so I don't know how to fix that yet, but I'm working on patching the last few issues.
still need any help?
Thanks for asking. I haven't been working on my projects recently because of busyness outside of programming, but I think I'm close to resolving the final issues.
well, maybe at some point after, you could make a 2.0 version, assuming you ever feel up to it.
i tried testing this with my new quest3s, and i can't get it to work at all. i'm trying to use steamlink.

Assuming you got it for Christmas; if it's brand new, it's not a problem with the Quest. I haven't tested that device or Steam Link, I've only tested standalone on the Oculus browser. Which version of the extension are you using, and could you provide more specifics on the issue?

bought it myself just a couple days ago (christmas eve) so yeah it's new. i had just pulled the extension straight from here so it's the current version.
going into vr mode seemed to do nothing despite having my headset connected via steamlink or meta horizon link. it's just not detected. not sure what to do in this case. i might see if i could add nodejs support to this ext and use nodejs to add support for steamlink and meta horizon link tho

Have you tried that version of the extension on desktop with the Immersive Web Emulator? Have you tried playing any audio upon entering VR to see if it works at all? There's a known issue where the display is sometimes black upon entering VR until you press the Oculus button, so audio feedback usually reveals whether that's what's happening.

Audio didn't work. I was using the desktop version so I couldn't use the Immersive Web Emulator, and I intend to package to an exe anyway, so that wouldn't be a viable option if steamlink and meta horizon link don't work. @David-Orangemoon did manage to do a VR thing for PM at some point though and made it work via steamlink. maybe they can give pointers?
I'll have to check this out. I've mainly been testing on Safari, Chrome, and the standalone Oculus Quest 2. I haven't tested PCVR or Link, but a user at Brackets-Coder/BlockifyVR#5 managed to get it working on PCVR. Sometimes it's as simple as your browser not supporting WebXR, and if it does, you might have to enable a flag or something. Please let me know if you encounter any more issues so I can narrow things down further before merging.
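The "browser doesn't support WebXR" case can be checked with the standard `navigator.xr.isSessionSupported()` API; the wrapper function below is just for illustration, written so the fallback branch also runs outside a browser (where `navigator` doesn't exist):

```javascript
// Probe WebXR support. Pass `navigator` in a browser; anything without
// an `xr` property hits the fallback branch.
async function xrSupportMessage(nav) {
  if (!nav || !nav.xr) return "WebXR not available in this browser";
  const supported = await nav.xr.isSessionSupported("immersive-vr");
  return supported ? "immersive-vr supported" : "immersive-vr not supported";
}

// In a browser: xrSupportMessage(navigator).then(console.log);
// Outside one, the fallback fires:
xrSupportMessage(undefined).then(console.log); // WebXR not available in this browser
```

Running this in the browser console of the headset (or of the PCVR browser) quickly separates "extension bug" from "no WebXR at all".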
Also, as a general update: I haven't been programming a lot in the past two months, so I'm starting to get back into everything. I'm honestly planning on just fixing the final issues and then merging; features like controller vibration and platform detection can easily be implemented later. Hopefully we can get this done and merged before June.
I was using the same version straight from the GitHub, but in the desk-penguin editor.
Was it the v0.8 version under BlockifyVR/releases, or the latest commit on this pull request?
Latest commit |
Okay, I'll have to figure it out |
I think I should get back to finishing this up |
I should really get back to fixing the last issue... |
Is this still alive |
Kind of. I haven't really been working on it much; I've had a lot of stuff going on and I've been working on other projects as well, but I plan to fix the last bug soon and open it for review. When, I don't know, but I'm also making a macOS path tracer, a three.js extension, and an AI plugin for MuseScore. I'll probably get back to it sometime in the next week or month or something, but you never know. There's literally one bug left and I can't seem to make sense of it.
BlockifyVR
This is a Virtual Reality extension I've been working on since January 27th of 2023.
The end goal is to have:
Here are some things to note: