[Vm-dev] VM Maker: VMMaker.oscog-eem.1661.mcz

commits at source.squeak.org
Fri Jan 22 01:14:50 UTC 2016


Eliot Miranda uploaded a new version of VMMaker to project VM Maker:
http://source.squeak.org/VMMaker/VMMaker.oscog-eem.1661.mcz

==================== Summary ====================

Name: VMMaker.oscog-eem.1661
Author: eem
Time: 22 January 2016, 5:13:11.061636 pm
UUID: a34e0ec2-d251-4618-bd67-cef5b3eddcf5
Ancestors: VMMaker.oscog-eem.1660

Cogit IMMUTABILITY cleanups.

Remove the need to map the primitive error code store bytecode by having mapFor:bcpc:performUntil:arg: skip the CallPrimitive & StoreTemp bytecodes if present.  Also avoid the set-up overhead when mapping the first bytecode by moving the first invocation early.
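The skipped delta is just the summed sizes of the two leading bytecodes when the method stores its primitive error code.  A minimal Python sketch of that computation (the byte sizes here are illustrative placeholders, not real Spur bytecode encodings):

```python
# Sketch (not VMMaker code) of what deltaToSkipPrimAndErrorStoreIn:header:
# computes: how many bytes to skip to reach the first bytecode past the
# CallPrimitive and the optional primitive-error-code store.

CALL_PRIMITIVE_BYTECODE_SIZE = 3   # assumed size, for the sketch only
LONG_STORE_TEMP_BYTECODE_SIZE = 2  # assumed size, for the sketch only

def delta_to_skip_prim_and_error_store(uses_primitive_error_code: bool) -> int:
    """Bytecodes to skip so mapping starts at the first real bytecode."""
    if uses_primitive_error_code:
        return CALL_PRIMITIVE_BYTECODE_SIZE + LONG_STORE_TEMP_BYTECODE_SIZE
    return 0

assert delta_to_skip_prim_and_error_store(False) == 0
assert delta_to_skip_prim_and_error_store(True) == 5
```

The real method asks the coInterpreter for the two sizes (sizeOfCallPrimitiveBytecode: and sizeOfLongStoreTempBytecode:) rather than using constants.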

Correct and simplify testBcToMcPcMappingForCogMethod:; its compiled method parameter was never used, and the startpc of a method should be derived from the cog method's method obj.
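The property the rewritten test checks can be pictured as a round trip: mapping a bytecode pc to its machine-code pc and back must answer the original bytecode pc.  A toy model of that invariant (the pc pairs below are invented, not real Cogit output):

```python
# Toy model of the bc <-> mc pc mapping invariant asserted by
# testBcToMcPcMappingForCogMethod:.  The table is illustrative only.

bc_to_mc = {16: 0x40, 18: 0x52, 21: 0x66}           # bcpc -> mcpc (assumed)
mc_to_bc = {mc: bc for bc, mc in bc_to_mc.items()}  # the inverse mapping

def round_trips(bcpc: int) -> bool:
    """True if bcpc survives the bc -> mc -> bc round trip."""
    return mc_to_bc[bc_to_mc[bcpc]] == bcpc

assert all(round_trips(bcpc) for bcpc in bc_to_mc)
```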

Refactor genImmutableCheck:slotIndex:sourceReg:scratchReg:needRestoreRcvr: to lose its unused popBoolean: keyword.

Implement IMMUTABILITY in SimpleStackBasedCogit.  All combinations of SimpleStackBasedCogit x StackToRegisterMappingCogit x IMMUTABILITY=true/false work on the reader image in the simulator.  THANKS Clément!!
Add SimpleStackBasedCogit>>putSelfInReceiverResultReg and use it where appropriate.
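The generated store sequence amounts to a guard around the raw slot write.  A hypothetical Python model of that control flow (checked_store, cannot_assign_error, and the dict-based objects are inventions for illustration, not generated machine code):

```python
# Hypothetical model of the control flow genImmutableCheck:... wraps
# around a store: mutable receivers fall through to the slot write;
# immutable ones take the cannot-assign trampoline path and skip it.

def cannot_assign_error(obj, index):
    # Stands in for the ceCannotAssignToWithIndexTrampoline call.
    return ("cannotAssignTo:withIndex:", index)

def checked_store(obj, index, value):
    """Store value into obj's slot unless obj is immutable."""
    if obj["immutable"]:
        return cannot_assign_error(obj, index)  # the 'fail' jump path
    obj["slots"][index] = value                 # the mutable fall-through
    return None

mutable = {"immutable": False, "slots": [None, None]}
assert checked_store(mutable, 1, 42) is None and mutable["slots"][1] == 42

frozen = {"immutable": True, "slots": [None]}
assert checked_store(frozen, 0, 7) == ("cannotAssignTo:withIndex:", 0)
assert frozen["slots"][0] is None  # the store never happened
```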

Fix a comment or three.

Oh we will, we will not be moved.
Oh we have im-mut-a-bil-i-ty
Just by a bit in every object header,
we will not be moved.

[ed: that's pinning, surely]

=============== Diff against VMMaker.oscog-eem.1660 ===============

Item was added:
+ ----- Method: CogObjectRepresentationForSpur>>genImmutableCheck:slotIndex:sourceReg:scratchReg:needRestoreRcvr: (in category 'compile abstract instructions') -----
+ genImmutableCheck: regHoldingObjectMutated slotIndex: index sourceReg: regHoldingValueToStore scratchReg: scratchReg needRestoreRcvr: needRestoreRcvr
+ 	| mutableJump fail |
+ 	<var: #mutableJump type: #'AbstractInstruction *'>
+ 	<var: #fail type: #'AbstractInstruction *'>
+ 	<inline: true>
+ 	<option: #IMMUTABILITY>
+ 	"Trampoline convention: 
+ 	- objectMutated passed in ReceiverResultReg
+ 	- index (unboxed) passed in TempReg
+ 	- valueToStore passed in ClassReg.
+ 	Simulated stack is flushed, but if needRestoreRcvr is true 
+ 	the receiver has to be live after this operation."
+ 	self assert: regHoldingObjectMutated == ReceiverResultReg. 
+ 	self assert: scratchReg == TempReg.
+ 	self assert: regHoldingValueToStore == ClassReg.
+ 	mutableJump := self genJumpMutable: ReceiverResultReg scratchReg: TempReg.
+ 	
+ 	"We reach this code if the object mutated is immutable."
+ 	cogit MoveCq: index R: TempReg.
+ 	"trampoline call and mcpc to bcpc annotation."
+ 	cogit CallRT: ceCannotAssignToWithIndexTrampoline.
+ 	cogit annotateBytecode: cogit Label.
+ 	"restore ReceiverResultReg state if needed, the rest of the state is spilled"
+ 	needRestoreRcvr ifTrue: [ cogit putSelfInReceiverResultReg ].
+ 	fail := cogit Jump: 0.
+ 	
+ 	"We reach this code if the object mutated is mutable."
+ 	mutableJump jmpTarget: cogit Label.
+ 	
+ 	^ fail!

Item was removed:
- ----- Method: CogObjectRepresentationForSpur>>genImmutableCheck:slotIndex:sourceReg:scratchReg:popBoolean:needRestoreRcvr: (in category 'compile abstract instructions') -----
- genImmutableCheck: regHoldingObjectMutated slotIndex: index sourceReg: regHoldingValueToStore scratchReg: scratchReg popBoolean: popBoolean needRestoreRcvr: needRestoreRcvr
- 	| mutableJump fail |
- 	<var: #mutableJump type: #'AbstractInstruction *'>
- 	<var: #fail type: #'AbstractInstruction *'>
- 	<inline: true>
- 	<option: #IMMUTABILITY>
- 	"Trampoline convention: 
- 	- objectMutated passed in ReceiverResultReg
- 	- index (unboxed) passed in TempReg
- 	- valueToStore passed in ClassReg.
- 	Simulated stack is flushed, but if needRestoreRcvr is true 
- 	the receiver has to be live after this operation."
- 	self assert: regHoldingObjectMutated == ReceiverResultReg. 
- 	self assert: scratchReg == TempReg.
- 	self assert: regHoldingValueToStore == ClassReg.
- 	mutableJump := self genJumpMutable: ReceiverResultReg scratchReg: TempReg.
- 	
- 	"We reach this code if the object mutated is immutable."
- 	cogit MoveCq: index R: TempReg.
- 	"trampoline call and mcpc to bcpc annotation."
- 	cogit CallRT: ceCannotAssignToWithIndexTrampoline.
- 	cogit annotateBytecode: cogit Label.
- 	"restore ReceiverResultReg state if needed, the rest of the state is spilled"
- 	needRestoreRcvr ifTrue: [ cogit putSelfInReceiverResultReg ].
- 	fail := cogit Jump: 0.
- 	
- 	"We reach this code is the object mutated is mutable"
- 	mutableJump jmpTarget: cogit Label.
- 	
- 	^ fail!

Item was changed:
  ----- Method: Cogit>>bcpcsAndDescriptorsFor:bsOffset:do: (in category 'tests-method map') -----
  bcpcsAndDescriptorsFor: aMethod bsOffset: bsOffset do: quaternaryBlock
  	"Evaluate quaternaryBlock with the pc, byte, descriptor and numExtensions for
  	 all the bytecodes in aMethod.  Evaluate with byte, descriptor and numExtensions
  	 nil for the initialPC of the method and any blocks within it."
  	<doNotGenerate>
  	| nExts byte descriptor endpc latestContinuation pc primIdx |
  	((primIdx := coInterpreter primitiveIndexOf: aMethod) > 0
  	and: [coInterpreter isQuickPrimitiveIndex: primIdx]) ifTrue:
  		[^self].
  	latestContinuation := pc := coInterpreter startPCOfMethod: aMethod.
  	quaternaryBlock value: pc value: nil value: nil value: 0. "stackCheck/entry pc"
+ 	primIdx > 0 ifTrue:
+ 		[pc := pc + (self deltaToSkipPrimAndErrorStoreIn: aMethod
+ 							header: (coInterpreter methodHeaderOf: aMethod))].
  	nExts := 0.
  	endpc := objectMemory numBytesOf: aMethod.
  	[pc <= endpc] whileTrue:
  		[byte := objectMemory fetchByte: pc ofObject: aMethod.
  		descriptor := self generatorAt: byte + bsOffset.
  		descriptor isExtension ifFalse:
  			[quaternaryBlock value: pc value: byte value: descriptor value: nExts].
  		(descriptor isReturn
  		 and: [pc >= latestContinuation]) ifTrue:
  			[endpc := pc].
  		(descriptor isBranch
  		 or: [descriptor isBlockCreation]) ifTrue:
  			[| targetPC |
  			 descriptor isBlockCreation ifTrue:
  				[quaternaryBlock value: pc + descriptor numBytes value: nil value: nil value: 0]. "stackCheck/entry pc"
  			 targetPC := self latestContinuationPCFor: descriptor at: pc exts: nExts in: aMethod.
  			 self assert: targetPC < endpc.
  			 latestContinuation := latestContinuation max: targetPC].
  		pc := pc + descriptor numBytes.
  		nExts := descriptor isExtension ifTrue: [nExts + 1] ifFalse: [0]]!

Item was changed:
  ----- Method: Cogit>>compileMethodBody (in category 'compile abstract instructions') -----
  compileMethodBody
  	"Compile the top-level method body."
- 	| deltaForPrimErrorCode |
  	<inline: true>
  	endPC < initialPC ifTrue: [^0]. "quick primitives"
+ 	"When compiling, skip any initial CallPrimitive and optional StorePrimErrCode bytecodes.
+ 	 These are dealt with in compileFrameBuild."
+ 	^self compileAbstractInstructionsFrom: initialPC
+ 										+ (self deltaToSkipPrimAndErrorStoreIn: methodObj
+ 												header: methodHeader)
+ 		through: endPC!
- 	deltaForPrimErrorCode := self methodUsesPrimitiveErrorCode
- 									ifTrue: 
- 										[ self cppIf: IMMUTABILITY ifTrue: 
- 											[ self Nop. "to avoid conflict with interrrupt check mapping".
- 											self annotateBytecode: self Label ].
- 										(coInterpreter sizeOfCallPrimitiveBytecode: methodHeader)
- 										  + (coInterpreter sizeOfLongStoreTempBytecode: methodHeader)]
- 									ifFalse: [0].
- 	^self compileAbstractInstructionsFrom: initialPC + deltaForPrimErrorCode through: endPC!

Item was added:
+ ----- Method: Cogit>>deltaToSkipPrimAndErrorStoreIn:header: (in category 'compile abstract instructions') -----
+ deltaToSkipPrimAndErrorStoreIn: aMethodObj header: aMethodHeader
+ 	"Answer the number of bytecodes to skip to get to the first bytecode
+ 	 past the primitive call and any store of the error code."
+ 	^(self methodUsesPrimitiveErrorCode: aMethodObj header: aMethodHeader)
+ 		ifTrue: [(coInterpreter sizeOfCallPrimitiveBytecode: aMethodHeader)
+ 			  + (coInterpreter sizeOfLongStoreTempBytecode: aMethodHeader)]
+ 		ifFalse: [0]!

Item was changed:
  ----- Method: Cogit>>mapFor:bcpc:performUntil:arg: (in category 'method map') -----
  mapFor: cogMethod bcpc: startbcpc performUntil: functionSymbol arg: arg
  	"Machine-code <-> bytecode pc mapping support.  Evaluate functionSymbol
  	 for each mcpc, bcpc pair in the map until the function returns non-zero,
  	 answering that result, or 0 if it fails to.  This works only for frameful methods."
  	<var: #cogMethod type: #'CogBlockMethod *'>
  	<var: #functionSymbol declareC: 'sqInt (*functionSymbol)(BytecodeDescriptor *desc, sqInt isBackwardBranch, char *mcpc, sqInt bcpc, void *arg)'>
  	<var: #arg type: #'void *'>
  	<inline: true>
  	| isInBlock mcpc bcpc endbcpc map mapByte homeMethod aMethodObj result
  	  latestContinuation byte descriptor bsOffset nExts |
  	<var: #descriptor type: #'BytecodeDescriptor *'>
  	<var: #homeMethod type: #'CogMethod *'>
+ 
  	self assert: cogMethod stackCheckOffset > 0.
+ 	mcpc := cogMethod asUnsignedInteger + cogMethod stackCheckOffset.
+ 	"The stack check maps to the start of the first bytecode,
+ 	 the first bytecode being effectively after frame build."
+ 	result := self perform: functionSymbol
+ 					with: nil
+ 					with: false
+ 					with: (self cCoerceSimple: mcpc to: #'char *')
+ 					with: startbcpc
+ 					with: arg.
+ 	result ~= 0 ifTrue:
+ 		[^result].
+ 	bcpc := startbcpc.
  	"In both CMMethod and CMBlock cases find the start of the map and
  	 skip forward to the bytecode pc map entry for the stack check."
  	cogMethod cmType = CMMethod
  		ifTrue:
  			[isInBlock := false.
  			 homeMethod := self cCoerceSimple: cogMethod to: #'CogMethod *'.
  			 self assert: startbcpc = (coInterpreter startPCOfMethodHeader: homeMethod methodHeader).
  			 map := self mapStartFor: homeMethod.
  			 self assert: ((objectMemory byteAt: map) >> AnnotationShift = IsAbsPCReference
  						 or: [(objectMemory byteAt: map) >> AnnotationShift = IsObjectReference
  						 or: [(objectMemory byteAt: map) >> AnnotationShift = IsRelativeCall
  						 or: [(objectMemory byteAt: map) >> AnnotationShift = IsDisplacementX2N]]]).
  			 latestContinuation := startbcpc.
  			 aMethodObj := homeMethod methodObject.
  			 endbcpc := (objectMemory numBytesOf: aMethodObj) - 1.
+ 			 bsOffset := self bytecodeSetOffsetForHeader: homeMethod methodHeader.
+ 			"If the method has a primitive, skip it and the error code store, if any;
+ 			 Logically, these come before the stack check and so must be ignored."
+ 			 bcpc := bcpc + (self deltaToSkipPrimAndErrorStoreIn: aMethodObj
+ 									header: homeMethod methodHeader)]
- 			 bsOffset := self bytecodeSetOffsetForHeader: homeMethod methodHeader]
  		ifFalse:
  			[isInBlock := true.
+ 			 self assert: bcpc = cogMethod startpc.
  			 homeMethod := cogMethod cmHomeMethod.
  			 map := self findMapLocationForMcpc: cogMethod asUnsignedInteger + (self sizeof: CogBlockMethod)
  						inMethod: homeMethod.
  			 self assert: map ~= 0.
  			 self assert: ((objectMemory byteAt: map) >> AnnotationShift = HasBytecodePC "fiducial"
  						 or: [(objectMemory byteAt: map) >> AnnotationShift = IsDisplacementX2N]).
  			 [(objectMemory byteAt: map) >> AnnotationShift ~= HasBytecodePC] whileTrue:
  				[map := map - 1].
  			 map := map - 1. "skip fiducial; i.e. the map entry for the pc immediately following the method header."
  			 aMethodObj := homeMethod methodObject.
  			 bcpc := startbcpc - (self blockCreationBytecodeSizeForHeader: homeMethod methodHeader).
  			 bsOffset := self bytecodeSetOffsetForHeader: homeMethod methodHeader.
  			 byte := (objectMemory fetchByte: bcpc ofObject: aMethodObj) + bsOffset.
  			 descriptor := self generatorAt: byte.
+ 			 endbcpc := self nextBytecodePCFor: descriptor at: bcpc exts: -1 in: aMethodObj.
+ 			 bcpc := startbcpc].
- 			 endbcpc := self nextBytecodePCFor: descriptor at: bcpc exts: -1 in: aMethodObj].
- 	bcpc := startbcpc.
- 	mcpc := cogMethod asUnsignedInteger + cogMethod stackCheckOffset.
  	nExts := 0.
- 	"The stack check maps to the start of the first bytecode,
- 	 the first bytecode being effectively after frame build."
- 	result := self perform: functionSymbol
- 					with: nil
- 					with: false
- 					with: (self cCoerceSimple: mcpc to: #'char *')
- 					with: startbcpc
- 					with: arg.
- 	result ~= 0 ifTrue:
- 		[^result].
  	"Now skip up through the bytecode pc map entry for the stack check." 
  	[(objectMemory byteAt: map) >> AnnotationShift ~= HasBytecodePC] whileTrue:
  		[map := map - 1].
  	map := map - 1.
  	[(mapByte := objectMemory byteAt: map) ~= MapEnd] whileTrue: "defensive; we exit on bcpc"
  		[mapByte >= FirstAnnotation
  			ifTrue:
  				[| annotation nextBcpc isBackwardBranch |
  				annotation := mapByte >> AnnotationShift.
  				mcpc := mcpc + ((mapByte bitAnd: DisplacementMask) * backEnd codeGranularity).
  				(self isPCMappedAnnotation: annotation) ifTrue:
  					[[byte := (objectMemory fetchByte: bcpc ofObject: aMethodObj) + bsOffset.
  					  descriptor := self generatorAt: byte.
  					  isInBlock
  						ifTrue: [bcpc >= endbcpc ifTrue: [^0]]
  						ifFalse:
  							[(descriptor isReturn and: [bcpc >= latestContinuation]) ifTrue: [^0].
  							 (descriptor isBranch or: [descriptor isBlockCreation]) ifTrue:
  								[| targetPC |
  								 targetPC := self latestContinuationPCFor: descriptor at: bcpc exts: nExts in: aMethodObj.
  								 latestContinuation := latestContinuation max: targetPC]].
  					  nextBcpc := self nextBytecodePCFor: descriptor at: bcpc exts: nExts in: aMethodObj.
  					  descriptor isMapped
  					  or: [isInBlock and: [descriptor isMappedInBlock]]] whileFalse:
  						[bcpc := nextBcpc.
  						 nExts := descriptor isExtension ifTrue: [nExts + 1] ifFalse: [0]].
  					isBackwardBranch := descriptor isBranch
  										   and: [self isBackwardBranch: descriptor at: bcpc exts: nExts in: aMethodObj].
  					result := self perform: functionSymbol
  									with: descriptor
  									with: isBackwardBranch
  									with: (self cCoerceSimple: mcpc to: #'char *')
  									with: bcpc
  									with: arg.
  					 result ~= 0 ifTrue:
  						[^result].
  					 bcpc := nextBcpc.
  					 nExts := descriptor isExtension ifTrue: [nExts + 1] ifFalse: [0]]]
  			ifFalse:
  				[self assert: (mapByte >> AnnotationShift = IsDisplacementX2N
  							or: [mapByte >> AnnotationShift = IsAnnotationExtension]).
  				 mapByte < (IsAnnotationExtension << AnnotationShift) ifTrue:
  					[mcpc := mcpc + ((mapByte - DisplacementX2N << AnnotationShift) * backEnd codeGranularity)]].
  		 map := map - 1].
  	^0!

Item was added:
+ ----- Method: Cogit>>testBcToMcPcMappingForCogMethod: (in category 'testing') -----
+ testBcToMcPcMappingForCogMethod: cogMethod
+ 	<doNotGenerate>
+ 	"self disassembleMethod: cogMethod"
+ 	"self printPCMapPairsFor: cogMethod on: Transcript"
+ 	| aMethodObj subMethods bsOffset |
+ 	aMethodObj := cogMethod methodObject.
+ 	subMethods := self subMethodsAsRangesFor: cogMethod.
+ 	subMethods first endPC: (self endPCOf: aMethodObj).
+ 	bsOffset := self bytecodeSetOffsetFor: aMethodObj.
+ 	self bcpcsAndDescriptorsFor: aMethodObj bsOffset: bsOffset do:
+ 		[:bcpc :byte :desc :nExts| | subMethod |
+ 		(desc notNil and: [desc isBlockCreation]) ifTrue:
+ 			[subMethod := subMethods detect: [:sm| sm startpc = (bcpc + desc numBytes)].
+ 			 subMethod endPC: bcpc + desc numBytes + (self spanFor: desc at: bcpc exts: -1 in: aMethodObj) - 1]].
+ 	subMethods allButFirst do:
+ 		[:blockSubMethod| | cogBlockMethod |
+ 		cogBlockMethod := self
+ 								findMethodForStartBcpc: blockSubMethod startpc
+ 								inHomeMethod: cogMethod.
+ 		self assert: cogBlockMethod address = (blockSubMethod first - (self sizeof: CogBlockMethod))].
+ 	self bcpcsAndDescriptorsFor: aMethodObj bsOffset: bsOffset do:
+ 		[:bcpc :byte :desc :nExts| | startBcpc currentSubMethod subCogMethod absMcpc mappedBcpc |
+ 		currentSubMethod := self innermostSubMethodFor: bcpc in: subMethods startingAt: 1.
+ 		subCogMethod := currentSubMethod cogMethod.
+ 		(subCogMethod stackCheckOffset > 0
+ 		 and: [desc isNil or: [desc isMapped]]) ifTrue:
+ 			[startBcpc := subCogMethod = cogMethod
+ 							ifTrue: [coInterpreter startPCOfMethod: aMethodObj]
+ 							ifFalse: [currentSubMethod startpc].
+ 			 "The first bytecode and backward branch bytecodes are mapped to their pc.
+ 			  Other bytecodes map to their following pc."
+ 			 absMcpc := (desc notNil
+ 						   and: [desc isBranch
+ 						   and: [self isBackwardBranch: desc at: bcpc exts: nExts in: aMethodObj]])
+ 							ifTrue: "Backward branches have a special mapper"
+ 								[mappedBcpc := bcpc.
+ 								 self
+ 									mcPCForBackwardBranch: mappedBcpc
+ 									startBcpc: startBcpc
+ 									in: subCogMethod]
+ 							ifFalse: "All others use the generic mapper"
+ 								[mappedBcpc := desc ifNil: [bcpc] ifNotNil: [bcpc + desc numBytes].
+ 								 self
+ 									mcPCFor: mappedBcpc
+ 									startBcpc: startBcpc
+ 									in: subCogMethod].
+ 			 self assert: absMcpc >= (subCogMethod asInteger + subCogMethod stackCheckOffset).
+ 			 self assert: (self bytecodePCFor: absMcpc startBcpc: startBcpc in: subCogMethod) = mappedBcpc]]!

Item was removed:
- ----- Method: Cogit>>testBcToMcPcMappingForCompiledMethod:cogMethod: (in category 'tests-method map') -----
- testBcToMcPcMappingForCompiledMethod: aCompiledMethod cogMethod: cogMethod
- 	<doNotGenerate>
- 	"self disassembleMethod: cogMethod"
- 	"self printPCMapPairsFor: cogMethod on: Transcript"
- 	| aMethodObj subMethods bsOffset |
- 	aMethodObj := cogMethod methodObject.
- 	subMethods := self subMethodsAsRangesFor: cogMethod.
- 	subMethods first endPC: (self endPCOf: aMethodObj).
- 	bsOffset := self bytecodeSetOffsetFor: aMethodObj.
- 	self bcpcsAndDescriptorsFor: aMethodObj bsOffset: bsOffset do:
- 		[:bcpc :byte :desc :nExts| | subMethod |
- 		(desc notNil and: [desc isBlockCreation]) ifTrue:
- 			[subMethod := subMethods detect: [:sm| sm startpc = (bcpc + desc numBytes)].
- 			 subMethod endPC: bcpc + desc numBytes + (self spanFor: desc at: bcpc exts: -1 in: aMethodObj) - 1]].
- 	subMethods allButFirst do:
- 		[:blockSubMethod| | cogBlockMethod |
- 		cogBlockMethod := self
- 								findMethodForStartBcpc: blockSubMethod startpc
- 								inHomeMethod: cogMethod.
- 		self assert: cogBlockMethod address = (blockSubMethod first - (self sizeof: CogBlockMethod))].
- 	self bcpcsAndDescriptorsFor: aMethodObj bsOffset: bsOffset do:
- 		[:bcpc :byte :desc :nExts| | currentSubMethod subCogMethod absMcpc mappedBcpc |
- 		currentSubMethod := self innermostSubMethodFor: bcpc in: subMethods startingAt: 1.
- 		subCogMethod := currentSubMethod cogMethod.
- 		(subCogMethod stackCheckOffset > 0
- 		 and: [desc isNil or: [desc isMapped]]) ifTrue:
- 			["The first bytecode and backward branch bytecodes are mapped to their pc.
- 			  Other bytecodes map to their following pc."
- 			 absMcpc := (desc notNil
- 						   and: [desc isBranch
- 						   and: [self isBackwardBranch: desc at: bcpc exts: nExts in: aMethodObj]])
- 							ifTrue: "Backward branches have a special mapper"
- 								[mappedBcpc := bcpc.
- 								 self
- 									mcPCForBackwardBranch: mappedBcpc
- 									startBcpc: currentSubMethod startpc
- 									in: subCogMethod]
- 							ifFalse: "All others use the generic mapper"
- 								[mappedBcpc := desc ifNil: [bcpc] ifNotNil: [bcpc + desc numBytes].
- 								 self
- 									mcPCFor: mappedBcpc
- 									startBcpc: currentSubMethod startpc
- 									in: subCogMethod].
- 			 self assert: absMcpc >= (subCogMethod asInteger + subCogMethod stackCheckOffset).
- 			 self assert: (self bytecodePCFor: absMcpc startBcpc: currentSubMethod startpc in: subCogMethod) = mappedBcpc]]!

Item was changed:
  ----- Method: Cogit>>testPCMappingForCompiledMethod:cogMethod: (in category 'tests-method map') -----
  testPCMappingForCompiledMethod: aCompiledMethod cogMethod: cm
  	<doNotGenerate>
  	methodObj := methodHeader := nil.
  	self
  		testMcToBcPcMappingForCompiledMethod: aCompiledMethod cogMethod: cm;
+ 		testBcToMcPcMappingForCogMethod: cm!
- 		testBcToMcPcMappingForCompiledMethod: aCompiledMethod cogMethod: cm!

Item was changed:
  ----- Method: SimpleStackBasedCogit>>compileFrameBuild (in category 'compile abstract instructions') -----
  compileFrameBuild
  	"Build a frame for a CogMethod activation.  See CoInterpreter class>>initializeFrameIndices.
  	 		receiver (in ReceiverResultReg)
  			arg0
  			...
  			argN
  			caller's saved ip/this stackPage (for a base frame)
  	fp->	saved fp
  			method
  			context (uninitialized?)
  			receiver
  			first temp
  			...
  	sp->	Nth temp
  	If there is a primitive and an error code the Nth temp is the error code.
  	Ensure SendNumArgsReg is set early on (incidentally to nilObj) because
  	it is the flag determining whether context switch is allowed on stack-overflow."
  	| jumpSkip |
  	<inline: false>
  	<var: #jumpSkip type: #'AbstractInstruction *'>
  	needsFrame ifFalse: [^self].
  	backEnd hasLinkRegister ifTrue: [self PushR: LinkReg].
  	self PushR: FPReg.
  	self MoveR: SPReg R: FPReg.
  	methodLabel addDependent: (self annotateAbsolutePCRef:
  		(self PushCw: methodLabel asInteger)). "method"
  	self genMoveConstant: objectMemory nilObject R: SendNumArgsReg.
  	self PushR: SendNumArgsReg. "context"
  	self PushR: ReceiverResultReg.
  	methodOrBlockNumArgs + 1 to: (coInterpreter temporaryCountOfMethodHeader: methodHeader) do:
  		[:i|
  		self PushR: SendNumArgsReg].
+ 	(self methodUsesPrimitiveErrorCode: methodObj header: methodHeader) ifTrue:
- 	self methodUsesPrimitiveErrorCode ifTrue:
  		[self compileGetErrorCode].
  	self MoveAw: coInterpreter stackLimitAddress R: TempReg.
  	self CmpR: TempReg R: SPReg. "N.B. FLAGS := SPReg - TempReg"
  	"If we can't context switch for this method, use a slightly
  	 slower overflow check that clears SendNumArgsReg."
  	(coInterpreter canContextSwitchIfActivating: methodObj header: methodHeader)
  		ifTrue:
  			[self JumpBelow: stackOverflowCall.
  			 stackCheckLabel := self Label]
  		ifFalse:
  			[jumpSkip := self JumpAboveOrEqual: 0.
  			 self MoveCq: 0 R: SendNumArgsReg.
  			 self Jump: stackOverflowCall.
  			 jumpSkip jmpTarget: (stackCheckLabel := self Label)].
  	self annotateBytecode: stackCheckLabel.
  	self cppIf: #NewspeakVM ifTrue:
  		[numIRCs > 0 ifTrue:
  		 	[self PrefetchAw: theIRCs]]!

Item was changed:
  ----- Method: SimpleStackBasedCogit>>compilePrimitive (in category 'primitive generators') -----
  compilePrimitive
  	"Compile a primitive.  If possible, performance-critical primitives will
  	 be generated by their own routines (primitiveGenerator).  Otherwise,
  	 if there is a primitive at all, we call the C routine with the usual
  	 stack-switching dance, test the primFailCode and then either return
  	 on success or continue to the method body."
  	<inline: false>
  	| code opcodeIndexAtPrimitive primitiveDescriptor primitiveRoutine |
  	<var: #primitiveDescriptor type: #'PrimitiveDescriptor *'>
  	<var: #primitiveRoutine declareC: 'void (*primitiveRoutine)(void)'>
  	primitiveIndex = 0 ifTrue: [^0].
  	code := 0.
  	"Note opcodeIndex so that compileFallbackToInterpreterPrimitive:
  	 can discard arg load instructions for unimplemented primitives."
  	opcodeIndexAtPrimitive := opcodeIndex.
  	"If a descriptor specifies an argument count (by numArgs >= 0) then it must match
  	 for the generated code to be correct.  For example for speed many primitives use
  	 ResultReceiverReg instead of accessing the stack, so the receiver better be at
  	 numArgs down the stack.  Use the interpreter version if not."
  	((primitiveDescriptor := self primitiveGeneratorOrNil) notNil
  	 and: [primitiveDescriptor primitiveGenerator notNil
  	 and: [(primitiveDescriptor primNumArgs < 0 "means don't care"
  		   or: [primitiveDescriptor primNumArgs = (coInterpreter argumentCountOf: methodObj)])]]) ifTrue:
  		[code := objectRepresentation perform: primitiveDescriptor primitiveGenerator].
  	(code < 0 and: [code ~= UnimplementedPrimitive]) ifTrue: "Generator failed, so no point continuing..."
  		[^code].
  	code = UnfailingPrimitive ifTrue:
  		[^0].
+ 	"If the machine code version handles all cases the only reason to call the interpreter
+ 	 primitive is to reap the primitive error code.  Don't bother if it isn't used."
  	(code = CompletePrimitive
+ 	 and: [(self methodUsesPrimitiveErrorCode: methodObj header: methodHeader) not]) ifTrue:
- 	 and: [(self methodUsesPrimitiveErrorCode) not]) ifTrue:
  		[^0].
  	"Discard any arg load code generated by the primitive generator."
  	code = UnimplementedPrimitive ifTrue:
  		[opcodeIndex := opcodeIndexAtPrimitive].
  	((primitiveRoutine := coInterpreter
  							functionPointerForCompiledMethod: methodObj
  							primitiveIndex: primitiveIndex) isNil "no primitive"
  	or: [primitiveRoutine = (coInterpreter functionPointerFor: 0 inClass: nil) "routine = primitiveFail"]) ifTrue:
  		[^self genFastPrimFail].
  	minValidCallAddress := minValidCallAddress min: primitiveRoutine asUnsignedInteger.
  	^self compileInterpreterPrimitive: primitiveRoutine!

Item was changed:
  ----- Method: SimpleStackBasedCogit>>doubleExtendedDoAnythingBytecode (in category 'bytecode generators') -----
  doubleExtendedDoAnythingBytecode
  	"Replaces the Blue Book double-extended send [132], in which the first byte was wasted on 8 bits of argument count. 
  	Here we use 3 bits for the operation sub-type (opType),  and the remaining 5 bits for argument count where needed. 
  	The last byte give access to 256 instVars or literals. 
  	See also secondExtendedSendBytecode"
  	| opType |
  	opType := byte1 >> 5.
  	opType = 0 ifTrue:
  		[^self genSend: byte2 numArgs: (byte1 bitAnd: 31)].
  	opType = 1 ifTrue:
  		[^self genSendSuper: byte2 numArgs: (byte1 bitAnd: 31)].
  	"We need a map entry for this bytecode for correct parsing.
+ 	 The sends will get an IsSend entry anyway.  The other cases need a fake one."
- 	 The sends will get an IsSend entry anyway.  The other cases need a
- 	 fake one.  We could of course special case the scanning but that's silly."
  	opType caseOf: {
  			[2]	->	[(coInterpreter isReadMediatedContextInstVarIndex: byte2)
  						ifTrue: [self genPushMaybeContextReceiverVariable: byte2]
  						ifFalse: [self genPushReceiverVariable: byte2]].
  			[3]	->	[self genPushLiteralIndex: byte2].
  			[4]	->	[self genPushLiteralVariable: byte2].
+ 			[7]	->	[self genStorePop: false LiteralVariable: byte2.
+ 					 self cppIf: IMMUTABILITY ifTrue: ["genStorePop:LiteralVariable: annotates; don't annotate twice" ^0]] }
- 			[7]	->	[self genStorePop: false LiteralVariable: byte2] }
  		otherwise: "5 & 6"
  			[(coInterpreter isWriteMediatedContextInstVarIndex: byte2)
  				ifTrue: [self genStorePop: opType = 6 MaybeContextReceiverVariable: byte2]
+ 				ifFalse: [self genStorePop: opType = 6 ReceiverVariable: byte2].
+ 			 self cppIf: IMMUTABILITY ifTrue: ["genStorePop:...ReceiverVariable: annotate; don't annotate twice" ^0]].
+ 	"We need a map entry for this bytecode for correct parsing (if the method builds a frame)."
- 				ifFalse: [self genStorePop: opType = 6 ReceiverVariable: byte2]].
- 	"We need a map entry for this bytecode for correct parsing (if the method builds a frame).
- 	 We could of course special case the scanning but that's silly."
  	needsFrame ifTrue:
  		[self annotateBytecode: self Label].
  	^0!

Item was changed:
  ----- Method: SimpleStackBasedCogit>>extendedStoreAndPopBytecode (in category 'bytecode generators') -----
  extendedStoreAndPopBytecode
  	| variableType variableIndex |
  	variableType := byte1 >> 6 bitAnd: 3.
  	variableIndex := byte1 bitAnd: 63.
  	variableType = 0 ifTrue:
  		[^self genStorePop: true ReceiverVariable: variableIndex].
  	variableType = 1 ifTrue:
  		[self genStorePop: true TemporaryVariable: variableIndex.
+ 		"needs a fake map entry if Immutability is ON..."
- 		"needs a fake map entry is Immutability is ON..."
  		self cppIf: IMMUTABILITY ifTrue: [ self annotateBytecode: self Label. ].
  		^ 0].
  	variableType = 3 ifTrue:
  		[^self genStorePop: true LiteralVariable: variableIndex].
  	^EncounteredUnknownBytecode!

Item was changed:
  ----- Method: SimpleStackBasedCogit>>extendedStoreBytecode (in category 'bytecode generators') -----
  extendedStoreBytecode
  	| variableType variableIndex |
  	variableType := byte1 >> 6 bitAnd: 3.
  	variableIndex := byte1 bitAnd: 63.
  	variableType = 0 ifTrue:
  		[^self genStorePop: false ReceiverVariable: variableIndex].
  	variableType = 1 ifTrue:
  		[self genStorePop: false TemporaryVariable: variableIndex.
+ 		"needs a fake map entry if Immutability is ON..."
- 		"needs a fake map entry is Immutability is ON..."
  		self cppIf: IMMUTABILITY ifTrue: [ self annotateBytecode: self Label. ].
  		^ 0].
  	variableType = 3 ifTrue:
  		[^self genStorePop: false LiteralVariable: variableIndex].
  	^EncounteredUnknownBytecode!

Item was changed:
  ----- Method: SimpleStackBasedCogit>>genPushMaybeContextReceiverVariable: (in category 'bytecode generator support') -----
  genPushMaybeContextReceiverVariable: slotIndex 
  	<inline: false>
  	| jmpSingle jmpDone |
  	<var: #jmpSingle type: #'AbstractInstruction *'>
  	<var: #jmpDone type: #'AbstractInstruction *'>
  	self assert: needsFrame.
  	"See CoInterpreter>>contextInstructionPointer:frame: for an explanation
  	 of the instruction pointer slot handling."
  	slotIndex = InstructionPointerIndex ifTrue:
+ 		[self putSelfInReceiverResultReg.
- 		[self MoveMw: FoxMFReceiver r: FPReg R: ReceiverResultReg.
  		 self MoveCq: slotIndex R: SendNumArgsReg.
  		 self CallRT: ceFetchContextInstVarTrampoline.
  		 self PushR: SendNumArgsReg.
  		 ^0].
  	self MoveMw: FoxMFReceiver r: FPReg R: ReceiverResultReg.
  	objectRepresentation
  		genLoadSlot: SenderIndex
  		sourceReg: ReceiverResultReg
  		destReg: TempReg.
  	jmpSingle := objectRepresentation genJumpNotSmallIntegerInScratchReg: TempReg.
  	self MoveCq: slotIndex R: SendNumArgsReg.
  	self CallRT: ceFetchContextInstVarTrampoline.
  	jmpDone := self Jump: 0.
  	jmpSingle jmpTarget: self Label.
  	objectRepresentation
  		genLoadSlot: slotIndex
  		sourceReg: ReceiverResultReg
  		destReg: SendNumArgsReg.
  	jmpDone jmpTarget: (self PushR: SendNumArgsReg).
  	^0!

Item was changed:
  ----- Method: SimpleStackBasedCogit>>genPushReceiverVariable: (in category 'bytecode generator support') -----
  genPushReceiverVariable: index
  	<inline: false>
  	| maybeErr |
  	needsFrame ifTrue:
+ 		[self putSelfInReceiverResultReg].
- 		[self MoveMw: FoxMFReceiver r: FPReg R: ReceiverResultReg].
  	maybeErr := objectRepresentation genLoadSlot: index sourceReg: ReceiverResultReg destReg: TempReg.
  	maybeErr < 0 ifTrue:
  		[^maybeErr].
  	self PushR: TempReg.
  	^0!

Item was changed:
  ----- Method: SimpleStackBasedCogit>>genReturnReceiver (in category 'bytecode generators') -----
  genReturnReceiver
  	"Frameless method activation looks like
  				receiver
  				args
  		sp->	ret pc.
  	 Return pops receiver and arguments off the stack.  Callee pushes the result."
  	needsFrame ifTrue:
+ 		[self putSelfInReceiverResultReg].
- 		[self MoveMw: FoxMFReceiver r: FPReg R: ReceiverResultReg].
  	^self genUpArrowReturn!

Item was changed:
  ----- Method: SimpleStackBasedCogit>>genStorePop:LiteralVariable: (in category 'bytecode generator support') -----
  genStorePop: popBoolean LiteralVariable: litVarIndex
  	<inline: false>
+ 	| association immutabilityFailure |
+ 	<var: #immutabilityFailure type: #'AbstractInstruction *'>
- 	| association |
  	"The only reason we assert needsFrame here is that in a frameless method
  	 ReceiverResultReg must and does contain only self, but the ceStoreCheck
  	 trampoline expects the target of the store to be in ReceiverResultReg.  So
  	 in a frameless method we would have a conflict between the receiver and
  	 the literal store, unless we were smart enough to realise that ReceiverResultReg
  	 was unused after the literal variable store, unlikely given that methods
  	 return self by default."
  	self assert: needsFrame.
  	association := self getLiteral: litVarIndex.
  	self genMoveConstant: association R: ReceiverResultReg.
  	objectRepresentation
  		genEnsureObjInRegNotForwarded: ReceiverResultReg
  		scratchReg: TempReg.
  	popBoolean
  		ifTrue: [self PopR: ClassReg]
  		ifFalse: [self MoveMw: 0 r: SPReg R: ClassReg].
  	traceStores > 0 ifTrue:
  		[self CallRT: ceTraceStoreTrampoline].
+ 	self cppIf: IMMUTABILITY ifTrue: 
+ 		[immutabilityFailure := objectRepresentation
+ 									genImmutableCheck: ReceiverResultReg
+ 									slotIndex: ValueIndex
+ 									sourceReg: ClassReg
+ 									scratchReg: TempReg
+ 									needRestoreRcvr: true].
+ 	objectRepresentation
- 	^objectRepresentation
  		genStoreSourceReg: ClassReg
  		slotIndex: ValueIndex
  		destReg: ReceiverResultReg
  		scratchReg: TempReg
+ 		inFrame: needsFrame.
+ 
+ 	self cppIf: IMMUTABILITY ifTrue:
+ 		[immutabilityFailure jmpTarget: self Label].
+ 
+ 	^0!
- 		inFrame: needsFrame!
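
The IMMUTABILITY path above brackets the inlined store: genImmutableCheck:... emits a test whose failure branch is resolved to a label placed after genStoreSourceReg:..., so an immutable target skips the store. A minimal Python sketch of that control flow (dict flag and string results are illustrative stand-ins, not VM API; in Spur the immutability test is a bit in the object header and the failure path reaches a trampoline):

```python
ValueIndex = 1  # an Association holds its key at slot 0 and its value at slot 1

def store_literal_variable(association, value, immutability=True):
    """Model of the code emitted by genStorePop:LiteralVariable:."""
    if immutability and association.get('immutable', False):
        # failure branch: the JIT jumps past the inlined store to the
        # attempt-to-assign handling (immutabilityFailure jmpTarget:)
        return 'attempt-to-assign'
    association['slots'][ValueIndex] = value  # the inlined store
    return 'stored'
```

Note the immutable case leaves the association's value slot untouched, mirroring the jump over the generated store.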

Item was changed:
  ----- Method: SimpleStackBasedCogit>>genStorePop:MaybeContextReceiverVariable: (in category 'bytecode generator support') -----
  genStorePop: popBoolean MaybeContextReceiverVariable: slotIndex
  	<inline: false>
+ 	| jmpSingle jmpDone immutabilityFailure |
+ 	<var: #immutabilityFailure type: #'AbstractInstruction *'>
- 	| jmpSingle jmpDone |
  	<var: #jmpSingle type: #'AbstractInstruction *'>
  	<var: #jmpDone type: #'AbstractInstruction *'>
  	"The reason we need a frame here is that assigning to an inst var of a context may
  	 involve wholesale reorganization of stack pages, and the only way to preserve the
  	 execution state of an activation in that case is if it has a frame."
  	self assert: needsFrame.
+ 	self putSelfInReceiverResultReg.
- 	self MoveMw: FoxMFReceiver r: FPReg R: ReceiverResultReg.
  	objectRepresentation
  		genLoadSlot: SenderIndex
  		sourceReg: ReceiverResultReg
  		destReg: TempReg.
  	self MoveMw: 0 r: SPReg R: ClassReg.
  	jmpSingle := objectRepresentation genJumpNotSmallIntegerInScratchReg: TempReg.
  	self MoveCq: slotIndex R: SendNumArgsReg.
  	self CallRT: ceStoreContextInstVarTrampoline.
  	jmpDone := self Jump: 0.
  	jmpSingle jmpTarget: self Label.
  	traceStores > 0 ifTrue:
  		[self CallRT: ceTraceStoreTrampoline].
+ 	self cppIf: IMMUTABILITY ifTrue: 
+ 		[immutabilityFailure := objectRepresentation
+ 									genImmutableCheck: ReceiverResultReg
+ 									slotIndex: slotIndex
+ 									sourceReg: ClassReg
+ 									scratchReg: TempReg
+ 									needRestoreRcvr: true].
  	objectRepresentation
  		genStoreSourceReg: ClassReg
  		slotIndex: slotIndex
  		destReg: ReceiverResultReg
  		scratchReg: TempReg
  		inFrame: true.
  	jmpDone jmpTarget: self Label.
  	popBoolean ifTrue:
  		[self AddCq: objectMemory wordSize R: SPReg].
+ 	self cppIf: IMMUTABILITY ifTrue:
+ 		[immutabilityFailure jmpTarget: self Label].
  	^0!
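
The generated code above first tests the context's sender slot: a married context holds a tagged SmallInteger (an encoded frame pointer) there and must go through ceStoreContextInstVarTrampoline, while a single context is stored into directly, with the immutability check guarding only the direct path. A hedged Python model of that dispatch (helper names and dict representation are illustrative, not VM API):

```python
SENDER_INDEX = 0   # sender is the first fixed slot of a Context
IMMUTABILITY = True

def is_small_integer(x):
    # stands in for genJumpNotSmallIntegerInScratchReg:'s tag test
    return isinstance(x, int)

def store_context_inst_var(ctx, slot_index, value):
    """Model of the code emitted by genStorePop:MaybeContextReceiverVariable:."""
    if is_small_integer(ctx['slots'][SENDER_INDEX]):
        return 'ceStoreContextInstVar trampoline'   # married context: mediated store
    if IMMUTABILITY and ctx.get('immutable', False):
        return 'attempt-to-assign'                  # failure branch past the store
    ctx['slots'][slot_index] = value                # single context: plain store
    return 'stored'
```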

Item was changed:
  ----- Method: SimpleStackBasedCogit>>genStorePop:ReceiverVariable: (in category 'bytecode generator support') -----
  genStorePop: popBoolean ReceiverVariable: slotIndex
  	<inline: false>
+ 	| immutabilityFailure |
+ 	<var: #immutabilityFailure type: #'AbstractInstruction *'>
  	needsFrame ifTrue:
+ 		[self putSelfInReceiverResultReg].
- 		[self MoveMw: FoxMFReceiver r: FPReg R: ReceiverResultReg].
  	popBoolean
  		ifTrue: [self PopR: ClassReg]
  		ifFalse: [self MoveMw: 0 r: SPReg R: ClassReg].
  	traceStores > 0 ifTrue:
  		[self CallRT: ceTraceStoreTrampoline].
+ 	self cppIf: IMMUTABILITY ifTrue: 
+ 		[immutabilityFailure := objectRepresentation
+ 									genImmutableCheck: ReceiverResultReg
+ 									slotIndex: slotIndex
+ 									sourceReg: ClassReg
+ 									scratchReg: TempReg
+ 									needRestoreRcvr: true].
+ 	objectRepresentation
- 	^objectRepresentation
  		genStoreSourceReg: ClassReg
  		slotIndex: slotIndex
  		destReg: ReceiverResultReg
  		scratchReg: TempReg
+ 		inFrame: needsFrame.
+ 
+ 	self cppIf: IMMUTABILITY ifTrue:
+ 		[immutabilityFailure jmpTarget: self Label].
+ 
+ 	^0!
- 		inFrame: needsFrame!

Item was changed:
  ----- Method: SimpleStackBasedCogit>>marshallAbsentReceiverSendArguments: (in category 'bytecode generators') -----
  marshallAbsentReceiverSendArguments: numArgs
  	self assert: needsFrame.
+ 	self putSelfInReceiverResultReg.
- 	self MoveMw: FoxMFReceiver r: FPReg R: ReceiverResultReg.
  
  	"Shuffle arguments if necessary and push receiver."
  	numArgs = 0
  		ifTrue:
  			[self PushR: ReceiverResultReg]
  		ifFalse:
  			[self MoveMw: 0 r: SPReg R: TempReg.
  			self PushR: TempReg.
  			2 to: numArgs do:
  				[:index|
  				self MoveMw: index * objectMemory wordSize r: SPReg R: TempReg.
  				self MoveR: TempReg Mw: index - 1 * BytesPerWord r: SPReg].
  			self MoveR: ReceiverResultReg Mw: numArgs * BytesPerWord r: SPReg].!
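
The MoveMw:/PushR: shuffle above has a simple net effect: the receiver ends up beneath the arguments on the stack. Modeled in Python (top of stack = end of list; a sketch of the resulting layout, not of the actual pointer moves):

```python
def marshall_absent_receiver(stack, receiver, num_args):
    """Insert `receiver` beneath the top num_args items, as
    marshallAbsentReceiverSendArguments: arranges the stack."""
    if num_args == 0:
        stack.append(receiver)                       # just push the receiver
    else:
        stack.insert(len(stack) - num_args, receiver)
    return stack
```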

Item was added:
+ ----- Method: SimpleStackBasedCogit>>methodUsesPrimitiveErrorCode:header: (in category 'compile abstract instructions') -----
+ methodUsesPrimitiveErrorCode: aMethodObj header: aMethodHeader
+ 	"Answer if aMethodObj contains a primitive and uses the primitive error code."
+ 	<inline: true>
+ 	^(coInterpreter primitiveIndexOfMethod: aMethodObj header: aMethodHeader) > 0
+ 	  and: [(coInterpreter longStoreBytecodeForHeader: aMethodHeader)
+ 			= (objectMemory
+ 				fetchByte: initialPC + (coInterpreter sizeOfCallPrimitiveBytecode: aMethodHeader)
+ 				ofObject: aMethodObj)]!
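
The test above can be paraphrased: a method uses the primitive error code iff it has a primitive and the bytecode immediately after the callPrimitive bytecode is the long store(-temp) bytecode, i.e. the method source began `<primitive: ... error: ec>`. A Python sketch; opcode values and the callPrimitive size differ per bytecode set, so they are parameters here, and the byte values in the test below are made up:

```python
def method_uses_primitive_error_code(bytecodes, initial_pc, primitive_index,
                                     call_primitive_size, long_store_bytecode):
    """Model of methodUsesPrimitiveErrorCode:header:."""
    if primitive_index <= 0:
        return False          # no primitive, so no error code to store
    # does a long store bytecode directly follow the callPrimitive bytecode?
    return bytecodes[initial_pc + call_primitive_size] == long_store_bytecode
```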

Item was added:
+ ----- Method: SimpleStackBasedCogit>>putSelfInReceiverResultReg (in category 'bytecode generator support') -----
+ putSelfInReceiverResultReg
+ 	<inline: true>
+ 	self MoveMw: FoxMFReceiver r: FPReg R: ReceiverResultReg!

Item was changed:
  ----- Method: SistaStackToRegisterMappingCogit>>genMustBeBooleanTrampolineFor:called: (in category 'initialization') -----
  genMustBeBooleanTrampolineFor: boolean called: trampolineName
  	"This can be entered in one of two states, depending on SendNumArgsReg. See
  	 e.g. genJumpIf:to:.  If SendNumArgsReg is non-zero then this has been entered via
  	 the initial test of the counter in the jump executed count (i.e. the counter has
  	 tripped).  In this case TempReg contains the boolean to be tested and should not
  	 be offset, and ceCounterTripped should be invoked with the unoffset TempReg.
  	 If SendNumArgsReg is zero then this has been entered for must-be-boolean
  	 processing. TempReg has been offset by boolean and must be corrected and
  	 ceSendMustBeBoolean: invoked with the corrected value."
  	<var: #trampolineName type: #'char *'>
  	| jumpMBB |
  	<var: #jumpMBB type: #'AbstractInstruction *'>
  	<inline: false>
  	self zeroOpcodeIndex.
  	self CmpCq: 0 R: SendNumArgsReg.
  	jumpMBB := self JumpZero: 0.
  	"Open-code self compileTrampolineFor: #ceCounterTripped: numArgs: 1 arg: TempReg ...
  	 so we can restore ReceiverResultReg."
  	self genSmalltalkToCStackSwitch: true.
  	self
  		compileCallFor: #ceCounterTripped:
  		numArgs: 1
  		arg: TempReg
  		arg: nil
  		arg: nil
  		arg: nil
  		resultReg: TempReg "(*)"
  		saveRegs: false.
  	"(*) For the case where the ceCounterTripped: call returns (e.g. because there's no callback selector
  	 installed), the call to the ceSendMustBeBooleanAddTrue/FalseTrampoline is followed by a jump
  	 back to the start of the counter/condition test sequence.  For this case copy the C result to
  	 TempReg (the register that is tested), to reload it with the boolean to be tested."
  	backEnd genLoadStackPointers.
  	backEnd hasLinkRegister ifTrue:
  		[self PopR: LinkReg].
+ 	"To keep ReceiverResultReg live if optStatus thought it was, simply reload it
- 	"To keep ResultReceiverReg live if optStatus thiught it was, simply reload it
  	 from the frame pointer.  This avoids having to reload it in the common case
  	 (counter does not trip) if it was live."
+ 	self putSelfInReceiverResultReg.
- 	self MoveMw: FoxMFReceiver r: FPReg R: ReceiverResultReg.
  	self RetN: 0.
  	"If the objectRepresentation does want true & false to be mobile then we need to record these addresses."
  	self assert: (objectRepresentation shouldAnnotateObjectReference: boolean) not.
  	jumpMBB jmpTarget: (self AddCq: boolean R: TempReg).
  	^self genTrampolineFor: #ceSendMustBeBoolean:
  		called: trampolineName
  		numArgs: 1
  		arg: TempReg
  		arg: nil
  		arg: nil
  		arg: nil
  		saveRegs: false
  		pushLinkReg: true
  		resultReg: NoReg
  		appendOpcodes: true!
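
The method comment describes two entry states distinguished by SendNumArgsReg. That dispatch, modeled in Python (register names as plain variables; illustrative only, and the correction of TempReg mirrors the `AddCq: boolean R: TempReg` above):

```python
def must_be_boolean_trampoline(temp_reg, send_num_args_reg, boolean):
    """Model of the two entry states of the generated trampoline."""
    if send_num_args_reg != 0:
        # counter tripped: TempReg already holds the unoffset boolean
        return ('ceCounterTripped:', temp_reg)
    # must-be-boolean: TempReg was offset by `boolean`; add it back first
    return ('ceSendMustBeBoolean:', temp_reg + boolean)
```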

Item was changed:
  ----- Method: StackToRegisterMappingCogit>>doubleExtendedDoAnythingBytecode (in category 'bytecode generators') -----
  doubleExtendedDoAnythingBytecode
  	"Replaces the Blue Book double-extended send [132], in which the first byte was wasted on 8 bits of argument count. 
  	Here we use 3 bits for the operation sub-type (opType),  and the remaining 5 bits for argument count where needed. 
  	The last byte give access to 256 instVars or literals. 
  	See also secondExtendedSendBytecode"
  	| opType |
  	opType := byte1 >> 5.
  	opType = 0 ifTrue:
  		[^self genSend: byte2 numArgs: (byte1 bitAnd: 31)].
  	opType = 1 ifTrue:
  		[^self genSendSuper: byte2 numArgs: (byte1 bitAnd: 31)].
  	"We need a map entry for this bytecode for correct parsing.
+ 	 The sends will get an IsSend entry anyway.  The other cases need a fake one."
- 	 The sends will get an IsSend entry anyway.  The other cases need a
- 	 fake one.  We could of course special case the scanning but that's silly."
  	opType caseOf: {
  			[2]	->	[(coInterpreter isReadMediatedContextInstVarIndex: byte2)
  						ifTrue: [self genPushMaybeContextReceiverVariable: byte2]
  						ifFalse: [self genPushReceiverVariable: byte2.
  								self ssTop annotateUse: true.
  								^0]].
  			[3]	->	[self genPushLiteralIndex: byte2.
  					 self ssTop annotateUse: true.
  					 ^0].
  			[4]	->	[self genPushLiteralVariable: byte2.].
  			[7]	->	[self genStorePop: false LiteralVariable: byte2.
+ 					self cppIf: IMMUTABILITY ifTrue: [ "genStorePop:LiteralVariable: annotates; don't annotate twice" ^0 ] ] }
- 					self cppIf: IMMUTABILITY ifTrue: [ "instruction is mapped" ^0 ] ] }
  		otherwise: "5 & 6"
  			[(coInterpreter isWriteMediatedContextInstVarIndex: byte2)
  				ifTrue: [self genStorePop: opType = 6 MaybeContextReceiverVariable: byte2]
  				ifFalse: [self genStorePop: opType = 6 ReceiverVariable: byte2].
+ 			self cppIf: IMMUTABILITY ifTrue: [ "genStorePop:MaybeContextReceiverVariable: / genStorePop:ReceiverVariable: annotate; don't annotate twice" ^0 ]].
+ 	"We need a map entry for this bytecode for correct parsing (if the method builds a frame)."
- 			self cppIf: IMMUTABILITY ifTrue: [ "instruction is mapped" ^0 ]].
- 	"We need a map entry for this bytecode for correct parsing (if the method builds a frame).
- 	 We could of course special case the scanning but that's silly (or is it?)."
  	self assert: needsFrame.
  	"genPushMaybeContextInstVar, pushListVar, store & storePop all generate code"
  	self assert: self prevInstIsPCAnnotated not.
  	self annotateBytecode: self Label.
  	^0!
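
As the method comment states, byte1 packs a 3-bit operation sub-type and a 5-bit argument count, and byte2 indexes up to 256 inst vars or literals. A Python decoder of that encoding (the opType names follow the dispatch above; for opTypes 2, 5 and 6 the actual generator may further choose a context-mediated variant):

```python
OP_NAMES = {0: 'send', 1: 'send-super',
            2: 'push-receiver-variable', 3: 'push-literal-constant',
            4: 'push-literal-variable', 5: 'store-receiver-variable',
            6: 'store-pop-receiver-variable', 7: 'store-literal-variable'}

def decode_double_extended(byte1, byte2):
    op_type = byte1 >> 5      # 3 high bits: operation sub-type
    num_args = byte1 & 31     # 5 low bits: arg count (meaningful for sends)
    return OP_NAMES[op_type], num_args, byte2   # byte2: 0..255 index
```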

Item was changed:
  ----- Method: StackToRegisterMappingCogit>>genStorePop:LiteralVariable: (in category 'bytecode generator support') -----
  genStorePop: popBoolean LiteralVariable: litVarIndex
  	<inline: false>
  	| topReg association needStoreCheck immutabilityFailure |
  	"The only reason we assert needsFrame here is that in a frameless method
  	 ReceiverResultReg must and does contain only self, but the ceStoreCheck
  	 trampoline expects the target of the store to be in ReceiverResultReg.  So
  	 in a frameless method we would have a conflict between the receiver and
  	 the literal store, unless we were smart enough to realise that ReceiverResultReg
  	 was unused after the literal variable store, unlikely given that methods
  	 return self by default."
  	self assert: needsFrame.
  	self cppIf: IMMUTABILITY ifTrue: [ self ssFlushTo: simStackPtr - 1 ].
  	"N.B.  No need to check the stack for references because we generate code for
  	 literal variable loads that stores the result in a register, deferring only the register push."
  	needStoreCheck := (objectRepresentation isUnannotatableConstant: self ssTop) not.
  	association := self getLiteral: litVarIndex.
  	optStatus isReceiverResultRegLive: false.
  	self ssAllocateRequiredReg: ReceiverResultReg. "for ceStoreCheck call in genStoreSourceReg: has to be ReceiverResultReg"
  	self genMoveConstant: association R: ReceiverResultReg.
  	objectRepresentation genEnsureObjInRegNotForwarded: ReceiverResultReg scratchReg: TempReg.
  	self 
  		cppIf: IMMUTABILITY
  		ifTrue: 
  			[ self ssAllocateRequiredReg: ClassReg.
  			  topReg := ClassReg.
  			  self ssStoreAndReplacePop: popBoolean toReg: ClassReg.
  			  "stack is flushed except maybe ssTop if popBoolean is false.
  			  ssTop is a SSregister in this case due to #ssStoreAndReplacePop:
  			  to avoid a second indirect read / annotation in case of SSConstant
  			  or SSBaseRegister"
  			  self ssFlushTo: simStackPtr.
+ 			  immutabilityFailure := objectRepresentation
+ 										genImmutableCheck: ReceiverResultReg
+ 										slotIndex: ValueIndex
+ 										sourceReg: ClassReg
+ 										scratchReg: TempReg
+ 										needRestoreRcvr: false ]
- 			  immutabilityFailure := objectRepresentation 
- 				genImmutableCheck: ReceiverResultReg 
- 				slotIndex: ValueIndex 
- 				sourceReg: ClassReg 
- 				scratchReg: TempReg 
- 				popBoolean: popBoolean
- 				needRestoreRcvr: false ]
  		ifFalse: 
  			[ topReg := self allocateRegForStackEntryAt: 0 notConflictingWith: (self registerMaskFor: ReceiverResultReg).
  			  self ssStorePop: popBoolean toReg: topReg ].
  	traceStores > 0 ifTrue:
  		[self MoveR: topReg R: TempReg.
  		 self CallRT: ceTraceStoreTrampoline].
  	objectRepresentation
  		genStoreSourceReg: topReg
  		slotIndex: ValueIndex
  		destReg: ReceiverResultReg
  		scratchReg: TempReg
  		inFrame: needsFrame
  		needsStoreCheck: needStoreCheck.
  	self cppIf: IMMUTABILITY ifTrue: [ immutabilityFailure jmpTarget: self Label ].
  	^ 0!

Item was changed:
  ----- Method: StackToRegisterMappingCogit>>genStorePop:MaybeContextReceiverVariable: (in category 'bytecode generator support') -----
  genStorePop: popBoolean MaybeContextReceiverVariable: slotIndex
  	<inline: false>
  	| jmpSingle jmpDone needStoreCheck immutabilityFailure |
  	<var: #jmpSingle type: #'AbstractInstruction *'>
  	<var: #jmpDone type: #'AbstractInstruction *'>
  	"The reason we need a frame here is that assigning to an inst var of a context may
  	 involve wholesale reorganization of stack pages, and the only way to preserve the
  	 execution state of an activation in that case is if it has a frame."
  	self assert: needsFrame.
  	self cppIf: IMMUTABILITY ifTrue: [ self ssFlushTo: simStackPtr - 1 ].
  	self ssFlushUpThroughReceiverVariable: slotIndex.
  	needStoreCheck := (objectRepresentation isUnannotatableConstant: self ssTop) not.
  	"Note that ReceiverResultReg remains live after both
  	 ceStoreContextInstVarTrampoline and ceStoreCheckTrampoline."
  	self ensureReceiverResultRegContainsSelf.
  	self ssPop: 1.
  	self ssAllocateCallReg: ClassReg and: SendNumArgsReg. "for ceStoreContextInstVarTrampoline"
  	self ssPush: 1.
  	objectRepresentation
  		genLoadSlot: SenderIndex
  		sourceReg: ReceiverResultReg
  		destReg: TempReg.
  	self 
  		cppIf: IMMUTABILITY
  		ifTrue: 
  			[ self ssStoreAndReplacePop: popBoolean toReg: ClassReg.
  			  "stack is flushed except maybe ssTop if popBoolean is false.
  			  ssTop is a SSregister in this case due to #ssStoreAndReplacePop:
  			  to avoid a second indirect read / annotation in case of SSConstant
  			  or SSBaseRegister"
  			  self ssFlushTo: simStackPtr. ]
  		ifFalse: [ self ssStorePop: popBoolean toReg: ClassReg ].
  	jmpSingle := objectRepresentation genJumpNotSmallIntegerInScratchReg: TempReg.
  	self MoveCq: slotIndex R: SendNumArgsReg.
  	self CallRT: ceStoreContextInstVarTrampoline.
  	jmpDone := self Jump: 0.
  	jmpSingle jmpTarget: self Label.
  	traceStores > 0 ifTrue:
  		[self MoveR: ClassReg R: TempReg.
  		 self CallRT: ceTraceStoreTrampoline].
  	self 
  		cppIf: IMMUTABILITY
  		ifTrue: 
+ 			[ immutabilityFailure := objectRepresentation
+ 										genImmutableCheck: ReceiverResultReg
+ 										slotIndex: slotIndex
+ 										sourceReg: ClassReg
+ 										scratchReg: TempReg
+ 										needRestoreRcvr: true ].
- 			[ immutabilityFailure := objectRepresentation 
- 				genImmutableCheck: ReceiverResultReg 
- 				slotIndex: ValueIndex 
- 				sourceReg: ClassReg 
- 				scratchReg: TempReg 
- 				popBoolean: popBoolean
- 				needRestoreRcvr: true ].
  	objectRepresentation
  		genStoreSourceReg: ClassReg
  		slotIndex: slotIndex
  		destReg: ReceiverResultReg
  		scratchReg: TempReg
  		inFrame: true
  		needsStoreCheck: needStoreCheck.
  	jmpDone jmpTarget: self Label.
  	self cppIf: IMMUTABILITY ifTrue: [ immutabilityFailure jmpTarget: self Label ].
  	^0!

Item was changed:
  ----- Method: StackToRegisterMappingCogit>>genStorePop:ReceiverVariable: (in category 'bytecode generator support') -----
  genStorePop: popBoolean ReceiverVariable: slotIndex
  	<inline: false>
  	| topReg needStoreCheck immutabilityFailure |
  	self cppIf: IMMUTABILITY ifTrue: [ self assert: needsFrame. self ssFlushTo: simStackPtr - 1 ].
  	self ssFlushUpThroughReceiverVariable: slotIndex.
  	needStoreCheck := (objectRepresentation isUnannotatableConstant: self ssTop) not.
  	"Note that ReceiverResultReg remains live after ceStoreCheckTrampoline."
  	self ensureReceiverResultRegContainsSelf.
  	self 
  		cppIf: IMMUTABILITY
  		ifTrue: 
  			[ self ssAllocateRequiredReg: ClassReg.
  			  topReg := ClassReg.
  			  self ssStoreAndReplacePop: popBoolean toReg: ClassReg.
  			  "stack is flushed except maybe ssTop if popBoolean is false.
  			  ssTop is a SSregister in this case due to #ssStoreAndReplacePop:
  			  to avoid a second indirect read / annotation in case of SSConstant
  			  or SSBaseRegister"
  			  self ssFlushTo: simStackPtr.
+ 			  immutabilityFailure := objectRepresentation
+ 										genImmutableCheck: ReceiverResultReg
+ 										slotIndex: slotIndex
+ 										sourceReg: ClassReg
+ 										scratchReg: TempReg
+ 										needRestoreRcvr: true ]
- 			  immutabilityFailure := objectRepresentation 
- 				genImmutableCheck: ReceiverResultReg 
- 				slotIndex: slotIndex 
- 				sourceReg: ClassReg 
- 				scratchReg: TempReg
- 				popBoolean: popBoolean
- 				needRestoreRcvr: true ]
  		ifFalse: 
  			[ topReg := self allocateRegForStackEntryAt: 0 notConflictingWith: (self registerMaskFor: ReceiverResultReg). 
  			  self ssStorePop: popBoolean toReg: topReg ].
  	traceStores > 0 ifTrue: 
  		[ self MoveR: topReg R: TempReg.
  		self evaluateTrampolineCallBlock: [ self CallRT: ceTraceStoreTrampoline ] protectLinkRegIfNot: needsFrame ].
  	objectRepresentation
  		genStoreSourceReg: topReg
  		slotIndex: slotIndex
  		destReg: ReceiverResultReg
  		scratchReg: TempReg
  		inFrame: needsFrame
  		needsStoreCheck: needStoreCheck.
  	self cppIf: IMMUTABILITY ifTrue: [ immutabilityFailure jmpTarget: self Label ].
  	^ 0!

Item was changed:
  ----- Method: StackToRegisterMappingCogit>>marshallAbsentReceiverSendArguments: (in category 'simulation stack') -----
  marshallAbsentReceiverSendArguments: numArgs
  	self assert: needsFrame.
  	self ssAllocateCallReg: ReceiverResultReg.
+ 	self putSelfInReceiverResultReg.
- 	self MoveMw: FoxMFReceiver r: FPReg R: ReceiverResultReg.
  
  	"Spill everything on the simulated stack that needs spilling (that below arguments).
  	 Marshall arguments to stack and/or registers depending on arg count.
  	 If the args don't fit in registers push receiver and args (spill everything).  Assume
  	 receiver already in ReceiverResultReg so shuffle args and push it if necessary."
  	self ssFlushTo: simStackPtr - numArgs.
  	numArgs > self numRegArgs
  		ifTrue:
  			["The arguments must be pushed to the stack, and hence the receiver
  			   must be inserted beneath the args.  Reduce or eliminate the argument
  			   shuffle by only moving already spilled items."
  			| numSpilled |
  			numSpilled := self numberOfSpillsInTopNItems: numArgs.
  			numSpilled > 0
  				ifTrue:
  					[self MoveMw: 0 r: SPReg R: TempReg.
  					 self PushR: TempReg.
  					 2 to: numSpilled do:
  						[:index|
  						self MoveMw: index * objectMemory wordSize r: SPReg R: TempReg.
  						self MoveR: TempReg Mw: index - 1 * objectMemory wordSize r: SPReg].
  					 self MoveR: ReceiverResultReg Mw: numSpilled * objectMemory wordSize r: SPReg]
  				ifFalse:
  					[self PushR: ReceiverResultReg].
  			self ssFlushTo: simStackPtr]
  		"Move the args to the register arguments, being careful to do
  		 so last to first so e.g. previous contents don't get overwritten.
  		 Also check for any arg registers in use by other args."
  		ifFalse:
  			[numArgs > 0 ifTrue:
  				[(self numRegArgs > 1 and: [numArgs > 1])
  					ifTrue:
  						[self ssAllocateRequiredReg: Arg0Reg upThrough: simStackPtr - 2.
  						 self ssAllocateRequiredReg: Arg1Reg upThrough: simStackPtr - 1]
  					ifFalse:
  						[self ssAllocateRequiredReg: Arg0Reg upThrough: simStackPtr - 1]].
  			 (self numRegArgs > 1 and: [numArgs > 1]) ifTrue:
  				[(self simStackAt: simStackPtr) popToReg: Arg1Reg].
  			 numArgs > 0 ifTrue:
  				[(self simStackAt: simStackPtr - numArgs + 1)
  					popToReg: Arg0Reg]].
  	self ssPop: numArgs!


