[squeak-dev] Installing high dpi support and backwards compatibility of the VM

Tobias Pape Das.Linux at gmx.de
Fri Oct 9 08:24:40 UTC 2020


Hi

> On 09.10.2020, at 09:58, Eliot Miranda <eliot.miranda at gmail.com> wrote:
> 
> Hi Tobias,
> 
> On Thu, Oct 8, 2020 at 11:57 PM Tobias Pape <Das.Linux at gmx.de> wrote:
> Hi
> 
> > On 08.10.2020, at 19:58, Eliot Miranda <eliot.miranda at gmail.com> wrote:
> > 
> > Hi All,
> > 
> >     ideally, adding high dpi support to the VM will not break backwards compatibility.  But that implies the VM must be told whether to create a high dpi display before it creates the display.  Off-list Tobias asked me where the VM sets up the display on the Mac, and I was surprised by the answer.
> > 
> > I thought it would be as part of beDisplay.  But it's actually a side-effect of DisplayScreen class>>actualScreenSize, primitive 106, which calls the ioScreenSize function.  It is this function's responsibility to actually create the display, deriving the size from the savedWindowSize info in the image header (which can be overridden on the VM command line, and is when -headless is supplied).
> > 
> > So any new primitive that lets DisplayScreen tell the VM whether to use high dpi would have to be invoked before primitive 106.  One way to implement this is to modify the chain of invocations leading up to primitive 106.  For this route I'd like to propose the following refactoring:
> > 
> > DisplayScreen class>>actualScreenSize
> >       <primitive: 106>
> >       ^ 640@480
> > 
> > becomes
> > 
> > DisplayScreen class>>actualScreenSize
> >       self primitiveUseHighDPI: self useHighDPI. "where this is a preference"
> >       ^self primitiveScreenSize
> > 
> > primitiveScreenSize
> >       <primitive: 106>
> >       ^ 640@480
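> > 
> > On the VM side the new primitive could be about this small (untested Slang sketch; ioSetUseHighDPI: is a made-up platform hook):
> > 
> > primitiveUseHighDPI
> >       "Sketch: receive a Boolean and hand it to the platform before the window exists."
> >       | flag |
> >       flag := self stackTop.
> >       (flag = objectMemory trueObject or: [flag = objectMemory falseObject]) ifFalse:
> >               [^self primitiveFailFor: PrimErrBadArgument].
> >       self ioSetUseHighDPI: flag = objectMemory trueObject.
> >       self pop: 1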
> > 
> 
> 
> Here's another idea:
> We already have
> 
> DisplayScreen class>>actualScreenScaleFactor
>         <primitive: 'primitiveScreenScaleFactor'>
>         ^ 1.0
> 
> And if we change DisplayScreen class>>startUp  to
> 
> DisplayScreen class>>startUp  "DisplayScreen startUp"
>         Display setScaleFactor: self actualScreenScaleFactor.
>         Display setExtent: self actualScreenSize depth: Display nativeDepth.
>         Display beDisplay
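> 
> For comparison, the current startUp is just (from memory, so modulo details):
> 
> DisplayScreen class>>startUp  "DisplayScreen startUp"
>         Display setExtent: self actualScreenSize depth: Display nativeDepth.
>         Display beDisplay
> 
> i.e. the only addition is the setScaleFactor: line.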
> 
> Very nice.  Let's go with this.
> 
> Then the contract could be:
> 
> "Iff you call primitiveScreenScaleFactor before any call to primitive 106, then you opt in to possibly high dpi"
> 
> That way, we do not have to change any image at all, because older images just don't call that primitive.
> 
> Yep, works for me.


noice.
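
The VM side of that contract could be about as small as this (hypothetical sketch; ioScreenScaleFactor and displayIsHighDPI are made-up names):

primitiveScreenScaleFactor
        "Answer the host's scale factor and, as a side effect, opt in to
         high dpi before ioScreenSize ever creates the window."
        displayIsHighDPI := true.
        self pop: 1.
        self pushFloat: self ioScreenScaleFactor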

> 
> 
> > 
> > Another route is to make the useHighDPI flag part of the image header state alongside the saved window size.  This would mean adding it to the flags accessed via vmParameterAt: 48.  There could be a command-line argument to override it.
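> > 
> > On the image side that could look something like this (sketch; the bit position is made up and would have to be agreed upon):
> > 
> > setUseHighDPI: aBoolean
> >       | flags |
> >       flags := Smalltalk vmParameterAt: 48.
> >       Smalltalk vmParameterAt: 48 put: (aBoolean
> >               ifTrue: [flags bitOr: 16r400]
> >               ifFalse: [flags bitClear: 16r400])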
> 
> Maybe a cmd-line parameter in any case…
> 
> +1
> 
> > 
> > 
> > Finally, I note that the beDisplay primitive simply stores the display object in the specialObjectsArray and assigns the interpreter variables that track the display: displayBits, displayWidth, displayHeight & displayDepth.  It then invokes ioNoteDisplayChangedwidthheightdepth, but *all* the implementations of this function are empty.  I propose that we eliminate this call and its implementation; it is confusing to follow it and find it does nothing.  The argument could be that a platform might require it, but if that's so we can always put it back.  We have an existence proof in all our platforms that this is unlikely.  Thoughts?
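> > 
> > For reference, the flow in question is roughly this (paraphrased from memory, not the actual source):
> > 
> > primitiveBeDisplay
> >       | displayObj |
> >       displayObj := self stackTop.
> >       objectMemory splObj: TheDisplay put: displayObj.
> >       "...assign displayBits, displayWidth, displayHeight and displayDepth from displayObj..."
> >       self ioNoteDisplayChanged: displayBits width: displayWidth
> >               height: displayHeight depth: displayDepth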
> 
> 
> Funny. The Mac VM says "/* This is invoked when the GC moves the display bitmap.  For now do nothing. */"  Does the GC actually ever do that?
> 
> Not now.  It used to.  One would see the pixels in the display become nonsense noise as the GC moved the display underneath the screen refresh.  But now in Spur the beDisplay primitive pins the display bits.
> It's also fairly recent:
> 
> 78c402ea71ebcc9db12496f81021fdb9b57deb5f (Fri May 12 19:29:45 2017)
> 
> StackInterpreter:
>     Simplify and make robust display bitmap access for display update.  The old code
>     required platforms that needed to redraw at arbitrary times to access the
>     display bits through interpreterProxy->displayObject, decoding it each time.
>     There exists a small window during compaction, etc., during which such access
>     will fail and cause a VM crash.  The new code provides four variables to
>     reference the display, displayBits, displayWidth, displayHeight and
>     displayDepth, which are assigned appropriately in the primitiveBeDisplay
>     primitive.  After a GC the interpreter checks if the displayBits have changed
>     location and if so calls ioNoteDisplayChanged:width:height:depth:
>     (ioNoteDisplayChangedwidthheightdepth) to inform the platform of the change
>     (currently all platforms implement this as a null function).
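> 
> The post-GC check it describes would be along these lines (sketch; the helper that fetches the bits pointer is made up):
> 
> checkForDisplayBitsMove
>       | bitsNow |
>       bitsNow := self displayBitsOf: (objectMemory splObj: TheDisplay).
>       bitsNow ~= displayBits ifTrue:
>               [displayBits := bitsNow.
>                self ioNoteDisplayChanged: displayBits width: displayWidth
>                       height: displayHeight depth: displayDepth]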
> 
> 
> So, old (<2017) code cannot depend on it, and new code does not use it. If the GC issue is moot, we can ditch it.
> 
> Even if the GC does move the display, ioNoteDisplayChangedwidthheightdepth can't be called until after the GC, which means that from the time the GC moves the display to the time the GC finishes, the display image is corrupted.  That's fixed in Spur, but in V3 you'll see it happen, especially if you resize the display (which allocates a new bitmap).  At least on the Mac I would see it regularly.

So we never used it.
The (non-stack) interpreter VM does not have any of this code,
so only the stack interpreter is affected?

-t

> 
> 
> Best regards
>         -Tobias
