I realize the SDK isn’t even out yet, but is this even feasible?
I’m talking about calling the visionOS API directly (or indirectly through a C interface library) so we can at least experiment with live, interactive programming rather than going through the typical compile-install-run-test cycle that Xcode requires.
Yes, I know they give you three static living-room scenes to view your code in, but there’s no mention of seeing how it looks and feels in a live environment, which is a glaring oversight on Apple’s part, and a golden opportunity for Squeak if it is doable.
L
Hi Lawson --
I don't know about the Vision Pro, but our group did some experiments with Squeak+Morphic+Godot+VR :-)
- https://github.com/hpi-swa-lab/pivr
- "Live Programming over TCP? Bringing Squeak/Smalltalk Liveness to Godot via React": https://www.youtube.com/watch?v=six50N8smq8
- "Toward a VR-Native Live Programming Environment" (Open Access): https://dl.acm.org/doi/10.1145/3563836.3568725 (PDF: https://dl.acm.org/doi/pdf/10.1145/3563836.3568725)
Best,
Marcel

On 16.06.2023 05:30:12, LawsonEnglish lenglish5@cox.net wrote:
squeak-dev@lists.squeakfoundation.org