[Vm-dev] VM Maker: VMMaker.oscog-eem.890.mcz

commits at source.squeak.org
Fri Oct 3 16:23:09 UTC 2014


Eliot Miranda uploaded a new version of VMMaker to project VM Maker:
http://source.squeak.org/VMMaker/VMMaker.oscog-eem.890.mcz

==================== Summary ====================

Name: VMMaker.oscog-eem.890
Author: eem
Time: 3 October 2014, 9:20:31.152 am
UUID: 4cec861e-a7df-45b3-8298-ecb127ee6703
Ancestors: VMMaker.oscog-eem.889

Spur:
Fix one-way become for classes with and without
copyHash, primarily by fixing allInstances.  One-way
become for classes causes duplicates in the class
table (either that or an allInstances scan would be
needed as part of the become to change instances
referring to the deleted class index, which would be
very slow).  So allInstances must be able to cope
with duplicates.  Hence split it into a fast path
for the common case where the class in question is
not duplicated, and a slower path for when it is.
Make both the marking phase of GC and allInstances
check for and eliminate references to duplicate
entries at wrong/obsolete class indices.
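
Condensed from the allInstancesOf: change in the
diff below, the split reads:

	(self isClassAtUniqueIndex: aClass)
		ifTrue:
			[self uniqueIndex: classIndex allInstancesInto: start limit: limit
				resultsInto: [:c :p| count := c. ptr := p]]
		ifFalse:
			[self ambiguousClass: aClass allInstancesInto: start limit: limit
				resultsInto: [:c :p| count := c. ptr := p]]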

Fix markAndShouldScan: to not scan the pun classes
of non-objects on the heap such as implicit receiver
caches and Sista counters.
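
For pure-bits objects the new guard, condensed from
the markAndShouldScan: diff below, is simply:

	(self classIndexOf: objOop) > self lastClassIndexPun ifTrue:
		[self markAndTraceClassOf: objOop]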

Eliminate the classTableBitmap premature
optimization.  All the information we need is in the
classTable and the classes' hashes therein.

Document the failure return of
allocateSlotsInOldSpace:bytes:format:classIndex:
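
The failure return is nil; callers such as
enterIntoClassTable: (in the diff below) check for
it, e.g.:

	page := self allocateSlotsInOldSpace: self classTablePageSize
				format: self arrayFormat
				classIndex: self arrayClassIndexPun.
	page ifNil:
		[^PrimErrNoMemory]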

Change pinObject: to answer 0 on failure and the
(possibly moved) object on success.  Much easier
than having to check for and follow a forwarding pointer.
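
A minimal caller-side sketch of the new convention
(pinnedObj is an illustrative temporary, not from
the commit; primitivePin below does the equivalent
inline):

	"illustrative temporary, not part of the commit"
	pinnedObj := objectMemory pinObject: obj.
	pinnedObj = 0 ifTrue:
		[^self primitiveFailFor: PrimErrNoMemory].
	"pinnedObj is now the pinned, possibly moved, oop;
	 no forwarder to check or follow"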

Simulator:
Eliminate the halt in Spur globalGarbageCollect;
the lemming approach of raising an error in the
copy is much more useful.

Fix FakeStdinStream and FilePluginSimulator>>
sqFile:Read:Into:At: for buffer sizes > 1.

Misc:
Fix some C compilation warnings.

=============== Diff against VMMaker.oscog-eem.889 ===============

Item was changed:
  ----- Method: Cogit>>unlinkIfForwardedSend:pc:ignored: (in category 'in-line cacheing') -----
  unlinkIfForwardedSend: annotation pc: mcpc ignored: superfluity
  	<var: #mcpc type: #'char *'>
  	| entryPoint cacheAddress |
  	(self isSendAnnotation: annotation) ifTrue:
  		[entryPoint := backEnd callTargetFromReturnAddress: mcpc asInteger.
  		 entryPoint > methodZoneBase
  			ifTrue: "It's a linked send, but maybe a super send or linked to an OpenPIC, in which case the cache tag will be a selector...."
+ 				[(objectMemory isForwardedClassIndex: (backEnd inlineCacheTagAt: mcpc asInteger)) ifTrue:
- 				[(objectMemory isForwardedClassIndex: (backEnd inlineCacheTagAt: mcpc)) ifTrue:
  					[self targetMethodAndSendTableFor: entryPoint into:
  						[:targetMethod :sendTable|
  						 self unlinkSendAt: mcpc targetMethod: targetMethod sendTable: sendTable]]]
  			ifFalse:
  				[self cppIf: NewspeakVM ifTrue:
  					[entryPoint = ceImplicitReceiverTrampoline ifTrue:
  						[self assert: NumOopsPerIRC = 2.
  						 cacheAddress := self implicitReceiverCacheAddressAt: mcpc.
  						 ((objectMemory isForwardedClassIndex: (backEnd unalignedLongAt: cacheAddress))
  						 or: [objectMemory isForwardedClassIndex: (backEnd unalignedLongAt: cacheAddress + BytesPerOop)]) ifTrue:
  							[self voidImplicitReceiverCacheAt: mcpc]]]]].
  	^0 "keep scanning"!

Item was added:
+ ----- Method: FakeStdinStream>>atEnd: (in category 'accessing') -----
+ atEnd: aBoolean
+ 	atEnd := aBoolean!

Item was changed:
  ----- Method: FakeStdinStream>>next (in category 'accessing') -----
  next
  	"Answer the next object in the Stream represented by the receiver.
  	 If there are no more elements in the stream fill up the buffer by prompting for input"
+ 	| sem threadIndex inputLine next |
- 	| sem threadIndex inputLine |
  	position >= readLimit ifTrue:
  		[simulator isThreadedVM
  			ifTrue:
  				["(simulator cogit singleStep not
  				  and: [UIManager confirm: 'Single step?']) ifTrue:
  					[simulator cogit singleStep: true]."
  				 threadIndex := simulator disownVM: DisownVMLockOutFullGC.
  				 simulator forceInterruptCheckFromHeartbeat.
  				 sem := Semaphore new.
  				 WorldState addDeferredUIMessage:
  					[inputLine := UIManager default request: 'Input please!!'.
  					 sem signal].
  				 sem wait]
  			ifFalse:
  				[inputLine := UIManager default request: 'Input please!!'].
  		 collection size <= inputLine size ifTrue:
  			[collection := collection species new: inputLine size + 1].
  		 collection
  			replaceFrom: 1 to: inputLine size with: inputLine startingAt: 1;
  		 	at: (readLimit := inputLine size + 1) put: Character lf.
  		 position := 0.
  		 simulator isThreadedVM ifTrue:
  			[simulator ownVM: threadIndex]].
+ 	next := collection at: (position := position + 1).
+ 	"This is set temporarily to allow FilePluginSimulator>>#sqFile:Read:Into:At:
+ 	 to break out of its loop.  sqFile:Read:Into:At: resets it on the way out."
+ 	atEnd := position >= readLimit.
+ 	^next
- 	^collection at: (position := position + 1)
  	
  
  " This does it with workspaces:
  | ws r s |
  s := Semaphore new.
  ws := Workspace new contents: ''.
  ws acceptAction: [:t| r := t asString. s signal].
  [ws openLabel: 'Yo!!'; shouldStyle: false.
  (ws dependents detect: [:dep | dep isKindOf: PluggableTextMorph] ifNone: [nil]) ifNotNil:
  	[:textMorph| textMorph acceptOnCR: true; hasUnacceptedEdits: true]] fork.
  Processor activeProcess ==  Project uiProcess
  	ifTrue: [[r isNil] whileTrue: [World doOneCycle]]
  	ifFalse: [s wait].
  ws topView delete.
  s wait. s signal.
  r"!

Item was changed:
  ----- Method: FilePluginSimulator>>sqFile:Read:Into:At: (in category 'simulation') -----
  sqFile: file Read: count Into: byteArrayIndex At: startIndex
  	| interpreter |
  	interpreter := interpreterProxy interpreter.
  	[[startIndex to: startIndex + count - 1 do:
  		[ :i |
+ 		file atEnd ifTrue:
+ 			[(file isKindOf: FakeStdinStream) ifTrue: [file atEnd: false].
+ 			 ^i - startIndex].
- 		file atEnd ifTrue: [^i - startIndex].
  		interpreter
  			byteAt: byteArrayIndex + i
  			put: file next asInteger]]
  			on: Error
  			do: [:ex|
  				(file isKindOf: TranscriptStream) ifFalse: [ex pass].
  				^0]]
  		ensure: [self recordStateOf: file].
  	^count!

Item was changed:
  ----- Method: InterpreterPrimitives>>primitivePin (in category 'memory space primitives') -----
  primitivePin
+ 	"Pin or unpin the receiver, i.e. make it immobile or mobile, based on the argument.
+ 	 Answer whether the object was already pinned. N.B. pinning does *not* prevent
+ 	 an object from being garbage collected."
+ 	| obj boolean wasPinned |
- 	"Pin or unpin the receiver, i.e. make it immobile or mobile.  Answer whether the object was
- 	 already pinned. N.B. pinning does *not* prevent an object from being garbage collected."
- 	| obj boolean wasPinned failure |
  	objectMemory hasSpurMemoryManagerAPI ifFalse:
  		[^self primitiveFailFor: PrimErrUnsupported].
  
  	obj := self stackValue: 1.
  	((objectMemory isImmediate: obj)
  	 or: [(objectMemory isForwarded: obj)
  	 or: [(objectMemory isContext: obj)
  		and: [self isStillMarriedContext: obj]]]) ifTrue:
  		[^self primitiveFailFor: PrimErrBadReceiver].
  	boolean := self stackTop.
  	(boolean = objectMemory falseObject
  	 or: [boolean = objectMemory trueObject]) ifFalse:
  		[^self primitiveFailFor: PrimErrBadArgument].
  
  	(objectMemory isPinned: obj)
  		ifTrue:
  			[wasPinned := objectMemory trueObject.
+ 			 objectMemory setIsPinnedOf: obj to: boolean = objectMemory trueObject]
- 			 objectMemory setIsPinnedOf: obj to: false]
  		ifFalse:
  			[wasPinned := objectMemory falseObject.
+ 			 (boolean = objectMemory trueObject
+ 			  and: [objectMemory pinObject: obj]) = 0 ifTrue:
+ 				[^self primitiveFailFor: PrimErrNoMemory]].
- 			 failure := objectMemory pinObject: obj.
- 			 failure ~= 0 ifTrue:
- 				[^self primitiveFailFor: failure]].
  	
  	self pop: argumentCount - 1 thenPush: wasPinned!

Item was changed:
  ----- Method: Spur32BitCoMemoryManager>>copyObj:toSegment:addr:stopAt:saveOopAt: (in category 'image segment in/out') -----
  copyObj: objOop toSegment: segmentWordArray addr: limitSeg stopAt: stopAddr saveOopAt: oopPtr
  	"Copy objOop into the segment beginning at limitSeg, and forward it to the copy.
  	 Fail if out of space.  Answer the next segmentAddr if successful."
  
  	"Copy the object..."
  	| bodySize copy |
  	<inline: false>
  	bodySize := self bytesInObject: objOop.
  	(self oop: limitSeg + bodySize isGreaterThanOrEqualTo: stopAddr) ifTrue:
  		[^0]. "failure"
+ 	self mem: limitSeg asVoidPointer cp: (self startOfObject: objOop) asVoidPointer y: bodySize.
- 	self mem: limitSeg cp: (self startOfObject: objOop) y: bodySize.
  	copy := self objectStartingAt: limitSeg.
  
  	"Clear remebered pinned and mark bits of all headers copied into the segment"
  	self
  		setIsRememberedOf: copy to: false;
  		setIsPinnedOf: copy to: false;
  		setIsMarkedOf: copy to: false.
  
  	"Make sure Cogged methods have their true header field written to the segment."
  	((self isCompiledMethod: objOop)
  	and: [coInterpreter methodHasCogMethod: objOop]) ifTrue:
  		[self storePointerUnchecked: HeaderIndex
  			ofObject: copy
  			withValue: (self methodHeaderOf: objOop)].
  
  	"Remember the oop for undoing in case of prim failure."
  	self longAt: oopPtr put: objOop.	
  	self forward: objOop to: copy.
  
  	"Return new end of segment"
  	^limitSeg + bodySize!

Item was changed:
  Spur32BitCoMemoryManager subclass: #Spur32BitMMLECoSimulator
+ 	instanceVariableNames: 'parent bootstrapping'
- 	instanceVariableNames: 'bootstrapping'
  	classVariableNames: ''
  	poolDictionaries: ''
  	category: 'VMMaker-SpurMemoryManagerSimulation'!

Item was added:
+ ----- Method: Spur32BitMMLECoSimulator>>globalGarbageCollect (in category 'gc - global') -----
+ globalGarbageCollect
+ 	"If we're /not/ a clone, clone the VM and push it over the cliff.
+ 	 If it survives, destroy the clone and continue.  We should be OK until next time."
+ 	parent ifNil:
+ 		[coInterpreter cr; print: 'GC number '; print: statFullGCs; tab; flush.
+ 		 coInterpreter cloneSimulation objectMemory globalGarbageCollect.
+ 		 Smalltalk garbageCollect].
+ 	^super globalGarbageCollect!

Item was added:
+ ----- Method: Spur32BitMMLECoSimulator>>parent (in category 'accessing') -----
+ parent
+ 
+ 	^ parent!

Item was added:
+ ----- Method: Spur32BitMMLECoSimulator>>parent: (in category 'accessing') -----
+ parent: anObject
+ 
+ 	parent := anObject!

Item was changed:
  ----- Method: Spur32BitMMLESimulator>>globalGarbageCollect (in category 'gc - global') -----
  globalGarbageCollect
  	"If we're /not/ a clone, clone the VM and push it over the cliff.
  	 If it survives, destroy the clone and continue.  We should be OK until next time."
  	parent ifNil:
+ 		[coInterpreter cr; print: 'GC number '; print: statFullGCs; tab; flush.
+ 		 coInterpreter cloneSimulation objectMemory globalGarbageCollect.
- 		[[coInterpreter cloneSimulation objectMemory globalGarbageCollect]
- 			on: Halt
- 			do: [:ex|
- 				(ex messageText beginsWith: 'GC number')
- 					ifTrue:
- 						[Transcript cr; cr; show: ex messageText; cr; cr.
- 						 ex resume]
- 					ifFalse: [ex pass]].
  		 Smalltalk garbageCollect].
  	^super globalGarbageCollect!

Item was changed:
  ----- Method: Spur32BitMemoryManager>>allocateSlotsInOldSpace:bytes:format:classIndex: (in category 'allocation') -----
  allocateSlotsInOldSpace: numSlots bytes: totalBytes format: formatField classIndex: classIndex
  	"Answer the oop of a chunk of space in oldSpace with numSlots slots.  The header
+ 	 will have been filled-in but not the contents.  If no memory is available answer nil."
- 	 will have been filled-in but not the contents."
  	<var: #totalBytes type: #usqInt>
  	<inline: false>
  	| chunk |
  	chunk := self allocateOldSpaceChunkOfBytes: totalBytes.
  	self checkFreeSpace.
  	chunk ifNil:
  		[^nil].
  	numSlots >= self numSlotsMask ifTrue: "for header parsing we put a saturated slot count in the prepended overflow size word"
  		[self flag: #endianness.
  		 self longAt: chunk put: numSlots.
  		 self longAt: chunk + 4 put: self numSlotsMask << self numSlotsHalfShift.
  		 self long64At: chunk + self baseHeaderSize
  			put: (self headerForSlots: self numSlotsMask format: formatField classIndex: classIndex).
  		 ^chunk + self baseHeaderSize].
  	self long64At: chunk put: (self headerForSlots: numSlots format: formatField classIndex: classIndex).
  	^chunk!

Item was changed:
  ----- Method: Spur64BitMemoryManager>>allocateSlotsInOldSpace:bytes:format:classIndex: (in category 'allocation') -----
  allocateSlotsInOldSpace: numSlots bytes: totalBytes format: formatField classIndex: classIndex
  	"Answer the oop of a chunk of space in oldSpace with numSlots slots.  The header
+ 	 will have been filled-in but not the contents.  If no memory is available answer nil."
- 	 will have been filled-in but not the contents."
  	<var: #totalBytes type: #usqInt>
  	<inline: false>
  	| chunk |
  	chunk := self allocateOldSpaceChunkOfBytes: totalBytes.
  	self checkFreeSpace.
  	chunk ifNil:
  		[^nil].
  	numSlots >= self numSlotsMask ifTrue: "for header parsing we put a saturated slot count in the prepended overflow size word"
  		[self longAt: chunk
  			put: numSlots + (self numSlotsMask << self numSlotsFullShift).
  		 self longAt: chunk + self baseHeaderSize
  			put: (self headerForSlots: self numSlotsMask format: formatField classIndex: classIndex).
  		 ^chunk + self baseHeaderSize].
  	self longAt: chunk
  		put: (self headerForSlots: numSlots format: formatField classIndex: classIndex).
  	^chunk!

Item was changed:
  CogClass subclass: #SpurMemoryManager
(excessive size, no diff calculated)

Item was changed:
  ----- Method: SpurMemoryManager class>>declareCVarsIn: (in category 'translation') -----
  declareCVarsIn: aCCodeGenerator
  	self declareCAsOop: #(	memory freeStart scavengeThreshold newSpaceStart newSpaceLimit pastSpaceStart
  							lowSpaceThreshold freeOldSpaceStart oldSpaceStart endOfMemory firstFreeChunk lastFreeChunk)
  		in: aCCodeGenerator.
  	self declareCAsUSqLong: (self allInstVarNames select: [:ivn| ivn endsWith: 'Usecs'])
  		in: aCCodeGenerator.
  	aCCodeGenerator
  		var: #freeListsMask type: #usqInt;
  		var: #freeLists type: #'sqInt *';
- 		var: #classTableBitmap type: #'unsigned char *';
  		var: #objStackInvalidBecause type: #'char *';
  		var: #highestObjects type: #SpurCircularBuffer;
  		var: #unscannedEphemerons type: #SpurContiguousObjStack;
  		var: #heapGrowthToSizeGCRatio type: #float;
  		var: #heapSizeAtPreviousGC type: #usqInt;
  		var: #totalFreeOldSpace type: #usqInt.
  	aCCodeGenerator
  		var: #remapBuffer
  		declareC: 'sqInt remapBuffer[RemapBufferSize + 1 /* ', (RemapBufferSize + 1) printString, ' */]'.
  	aCCodeGenerator
  		var: #extraRoots
  		declareC: 'sqInt *extraRoots[ExtraRootsSize + 1 /* ', (ExtraRootsSize + 1) printString, ' */]'!

Item was changed:
  ----- Method: SpurMemoryManager>>adjustAllOopsBy: (in category 'snapshot') -----
  adjustAllOopsBy: bytesToShift
  	"Adjust all oop references by the given number of bytes. This is
  	 done just after reading in an image when the new base address
  	 of the object heap is different from the base address in the image,
  	 or when loading multiple segments that have been coalesced.  Also
  	 set bits in the classTableBitmap corresponding to used classes."
  
  	| obj classIndex |
+ 	self assert: self newSpaceIsEmpty.
+ 	self countNumClassPagesPreSwizzle: bytesToShift.
- 	self countNumClassPagesPreSwizzle: bytesToShift;
- 		ensureAdequateClassTableBitmap.
  	(bytesToShift ~= 0
+ 	 or: [segmentManager numSegments > 1]) ifTrue:
+ 		[obj := self objectStartingAt: oldSpaceStart.
+ 		 [self oop: obj isLessThan: freeOldSpaceStart] whileTrue:
+ 			[classIndex := self classIndexOf: obj.
+ 			 classIndex >= self isForwardedObjectClassIndexPun
+ 				ifTrue:
+ 					[self swizzleFieldsOfObject: obj]
+ 				ifFalse:
+ 					[classIndex = self isFreeObjectClassIndexPun ifTrue:
+ 						[self swizzleFieldsOfFreeChunk: obj]].
+ 			 obj := self objectAfter: obj]]!
- 	 or: [segmentManager numSegments > 1])
- 		ifTrue:
- 			[self assert: self newSpaceIsEmpty.
- 			 obj := self objectStartingAt: oldSpaceStart.
- 			 [self oop: obj isLessThan: freeOldSpaceStart] whileTrue:
- 				[classIndex := self classIndexOf: obj.
- 				 classIndex >= self isForwardedObjectClassIndexPun
- 					ifTrue:
- 						[classIndex > self lastClassIndexPun ifTrue:
- 							[self inClassTableBitmapSet: classIndex].
- 						 self swizzleFieldsOfObject: obj]
- 					ifFalse:
- 						[classIndex = self isFreeObjectClassIndexPun ifTrue:
- 							[self swizzleFieldsOfFreeChunk: obj]].
- 				 obj := self objectAfter: obj]]
- 		ifFalse:
- 			[self assert: self newSpaceIsEmpty.
- 			 obj := self objectStartingAt: oldSpaceStart.
- 			 [self oop: obj isLessThan: endOfMemory] whileTrue:
- 				[classIndex := self classIndexOf: obj.
- 				 classIndex > self lastClassIndexPun ifTrue:
- 					[self inClassTableBitmapSet: classIndex].
- 				 obj := self objectAfter: obj]]!

Item was changed:
  ----- Method: SpurMemoryManager>>allInstancesOf: (in category 'primitive support') -----
  allInstancesOf: aClass
  	"Attempt to answer an array of all objects, excluding those that may
  	 be garbage collected as a side effect of allocating the result array.
  	 If no memory is available answer the number of instances as a SmallInteger.
  	 Since objects are at least 16 bytes big, and the largest SmallInteger covers
  	 1/4 of the address space, the count can never overflow."
  	| classIndex freeChunk ptr start limit count bytes |
  	classIndex := self rawHashBitsOf: aClass.
+ 	classIndex = 0 ifTrue:
- 	(classIndex = 0
- 	 or: [aClass ~~ (self classOrNilAtIndex: classIndex)]) ifTrue:
  		[freeChunk := self allocateSlots: 0 format: self arrayFormat classIndex: ClassArrayCompactIndex.
  		 ^freeChunk].
  	MarkObjectsForEnumerationPrimitives ifTrue:
  		[self markObjects]. "may not want to revive objects unnecessarily; but marking is sloooow."
  	freeChunk := self allocateLargestFreeChunk.
+ 	start := freeChunk + self baseHeaderSize.
- 	ptr := start := freeChunk + self baseHeaderSize.
  	limit := self addressAfter: freeChunk.
+ 	(self isClassAtUniqueIndex: aClass)
+ 		ifTrue:
+ 			[self uniqueIndex: classIndex allInstancesInto: start limit: limit resultsInto: [:c :p| count := c. ptr := p]]
+ 		ifFalse:
+ 			[self ambiguousClass: aClass allInstancesInto: start limit: limit resultsInto: [:c :p| count := c. ptr := p]].
- 	count := 0.
- 	self allHeapEntitiesDo:
- 		[:obj| "continue enumerating even if no room so as to unmark all objects."
- 		 (MarkObjectsForEnumerationPrimitives
- 				ifTrue: [self isMarked: obj]
- 				ifFalse: [true]) ifTrue:
- 			[(self isNormalObject: obj)
- 				ifTrue:
- 					[MarkObjectsForEnumerationPrimitives ifTrue:
- 						[self setIsMarkedOf: obj to: false].
- 					 (self classIndexOf: obj) = classIndex ifTrue:
- 					 	[count := count + 1.
- 						 ptr < limit ifTrue:
- 							[self longAt: ptr put: obj.
- 							 ptr := ptr + self bytesPerSlot]]]
- 				ifFalse:
- 					[MarkObjectsForEnumerationPrimitives ifTrue:
- 						[(self isSegmentBridge: obj) ifFalse:
- 							[self setIsMarkedOf: obj to: false]]]]].
  	self assert: (self isEmptyObjStack: markStack).
  	MarkObjectsForEnumerationPrimitives
  		ifTrue:
  			[self assert: self allObjectsUnmarked.
  			 self emptyObjStack: weaklingStack]
  		ifFalse:
  			[self assert: (self isEmptyObjStack: weaklingStack)].
  	(count > (ptr - start / self bytesPerSlot) "not enough room"
  	 or: [limit ~= ptr and: [limit - ptr <= self allocationUnit]]) ifTrue: "can't split a single word"
  		[self freeObject: freeChunk.
  		 ^self integerObjectOf: count].
  	count < self numSlotsMask ifTrue:
  		[| smallObj |
  		 smallObj := self allocateSlots: count format: self arrayFormat classIndex: ClassArrayCompactIndex.
  		 0 to: count - 1 do:
  			[:i|
  			self storePointerUnchecked: i ofObject: smallObj withValue: (self fetchPointer: i ofFreeChunk: freeChunk)].
  		 self freeChunkWithBytes: (self bytesInObject: freeChunk) at: (self startOfObject: freeChunk).
  		 self beRootIfOld: smallObj.
  		 self checkFreeSpace.
  		 ^smallObj].
  	bytes := self largeObjectBytesForSlots: count.
  	start := self startOfObject: freeChunk.
  	self freeChunkWithBytes: limit - start - bytes at: start + bytes.
  	totalFreeOldSpace := totalFreeOldSpace - bytes.
  	self setOverflowNumSlotsOf: freeChunk to: count.
  	self set: freeChunk classIndexTo: ClassArrayCompactIndex formatTo: self arrayFormat.
  	self possibleRootStoreInto: freeChunk.
  	self checkFreeSpace.
  	self runLeakCheckerForFullGC: false.
  	^freeChunk
  	
  	!

Item was changed:
  ----- Method: SpurMemoryManager>>allocateLargestFreeChunk (in category 'free space') -----
  allocateLargestFreeChunk
  	"Answer the largest free chunk in the free lists."
+ 	<inline: false>
  	| freeChunk next |
  	"would like to use ifNotNil: but the ^next inside the ^blah ifNotNil: confused Slang"
  	freeChunk := self findLargestFreeChunk.
  	freeChunk ifNil: [^nil].
  	"This will be the node, not a list element.  Answer a list element in preference."
  	next := self fetchPointer: self freeChunkNextIndex ofFreeChunk: freeChunk.
  	next ~= 0 ifTrue:
+ 		[self storePointer: self freeChunkNextIndex
- 		[self storePointer:  self freeChunkNextIndex
  			ofFreeChunk: freeChunk
  			withValue: (self fetchPointer: self freeChunkNextIndex ofFreeChunk: next).
  		 ^next].
  	self unlinkSolitaryFreeTreeNode: freeChunk.
  	^freeChunk!

Item was changed:
  ----- Method: SpurMemoryManager>>allocateSlotsInOldSpace:bytes:format:classIndex: (in category 'allocation') -----
  allocateSlotsInOldSpace: numSlots bytes: totalBytes format: formatField classIndex: classIndex
  	"Answer the oop of a chunk of space in oldSpace with numSlots slots.  The header
+ 	 will have been filled-in but not the contents.  If no memory is available answer nil."
- 	 will have been filled-in but not the contents."
  	<var: #totalBytes type: #usqInt>
  	^self subclassResponsibility!

Item was added:
+ ----- Method: SpurMemoryManager>>ambiguousClass:allInstancesInto:limit:resultsInto: (in category 'primitive support') -----
+ ambiguousClass: aClass allInstancesInto: start limit: limit resultsInto: binaryBlock
+ 	"Deal with ambiguity and normalize indices."
+ 	<inline: true>
+ 	| expectedIndex count ptr |
+ 	count := 0.
+ 	ptr := start.
+ 	expectedIndex := self rawHashBitsOf: aClass.
+ 	self allHeapEntitiesDo:
+ 		[:obj| | actualIndex | "continue enumerating even if no room so as to unmark all objects and/or normalize class indices."
+ 		 (MarkObjectsForEnumerationPrimitives
+ 				ifTrue: [self isMarked: obj]
+ 				ifFalse: [true]) ifTrue:
+ 			[(self isNormalObject: obj)
+ 				ifTrue:
+ 					[MarkObjectsForEnumerationPrimitives ifTrue:
+ 						[self setIsMarkedOf: obj to: false].
+ 					 actualIndex := self classIndexOf: obj.
+ 					 (self classOrNilAtIndex: actualIndex) = aClass ifTrue:
+ 					 	[actualIndex ~= expectedIndex ifTrue:
+ 							[self setClassIndexOf: obj to: expectedIndex].
+ 						 count := count + 1.
+ 						 ptr < limit ifTrue:
+ 							[self longAt: ptr put: obj.
+ 							 ptr := ptr + self bytesPerSlot]]]
+ 				ifFalse:
+ 					[MarkObjectsForEnumerationPrimitives ifTrue:
+ 						[(self isSegmentBridge: obj) ifFalse:
+ 							[self setIsMarkedOf: obj to: false]]]]].
+ 	self purgeDuplicateClassTableEntriesFor: aClass.
+ 	binaryBlock value: count value: ptr
+ !

Item was changed:
  ----- Method: SpurMemoryManager>>copyObj:toSegment:addr:stopAt:saveOopAt: (in category 'image segment in/out') -----
  copyObj: objOop toSegment: segmentWordArray addr: limitSeg stopAt: stopAddr saveOopAt: oopPtr
  	"Copy objOop into the segment beginning at limitSeg, and forward it to the copy.
  	 Fail if out of space.  Answer the next segmentAddr if successful."
  
  	"Copy the object..."
  	| bodySize copy |
  	<inline: false>
  	bodySize := self bytesInObject: objOop.
  	(self oop: limitSeg + bodySize isGreaterThanOrEqualTo: stopAddr) ifTrue:
  		[^0]. "failure"
+ 	self mem: limitSeg asVoidPointer cp: (self startOfObject: objOop) asVoidPointer y: bodySize.
- 	self mem: limitSeg cp: (self startOfObject: objOop) y: bodySize.
  	copy := self objectStartingAt: limitSeg.
  
  	"Clear remebered pinned and mark bits of all headers copied into the segment"
  	self
  		setIsRememberedOf: copy to: false;
  		setIsPinnedOf: copy to: false;
  		setIsMarkedOf: copy to: false.
  
  	"Remember the oop for undoing in case of prim failure."
  	self longAt: oopPtr put: objOop.	
  	self forward: objOop to: copy.
  
  	"Return new end of segment"
  	^limitSeg + bodySize!

Item was changed:
  ----- Method: SpurMemoryManager>>doBecome:to:copyHash: (in category 'become implementation') -----
  doBecome: obj1 to: obj2 copyHash: copyHashFlag
+ 	"one-way become with or without copying obj1's hash into obj2.
+ 	 Straightforward, even for classes.  With classes we end up with two entries
+ 	 for obj2.  Which one is current depends on copyHashFlag.  If copyHashFlag is true
+ 	 then the entry at obj1's hash is valid, otherwise the existing one at obj2's
+ 	 hash.  When all the instances with the old hash have been collected, the GC will
+ 	 discover this and expunge obj2 at the unused index (see markAndTraceClassOf:)."
- 	| o1HashBits o2HashBits |
- 	o1HashBits := self rawHashBitsOf: obj1.
- 	o2HashBits := self rawHashBitsOf: obj2.
  	self forward: obj1 to: obj2.
+ 	copyHashFlag ifTrue: [self setHashBitsOf: obj2 to: (self rawHashBitsOf: obj1)].
  	((self isOldObject: obj1)
  	 and: [self isYoungObject: obj2]) ifTrue:
  		[becomeEffectsFlags := becomeEffectsFlags bitOr: OldBecameNewFlag].
+ 	self deny: (self isForwarded: obj2)!
- 	copyHashFlag ifTrue: [self setHashBitsOf: obj2 to: o1HashBits].
- 	"obj1 is on its way out.  Remove it from the classTable"
- 	(o1HashBits ~= 0 and: [(self classAtIndex: o1HashBits) = obj1])
- 		ifTrue: [self expungeFromClassTable: obj1]
- 		ifFalse: [o1HashBits := 0]. "= 0 implies was not in class table"
- 	self deny: (self isForwarded: obj2).
- 	"o1HashBits ~= 0 implies obj1 was in class table and hence may have had instances.
- 	 Therefore o1HashBits needs to refer to obj2 (put obj2 in table at o1HashBits)."
- 	o1HashBits ~= 0 ifTrue:
- 		[o2HashBits = 0 ifTrue: "obj2 has no hash; we're free to assign one"
- 			[self setHashBitsOf: obj2 to: o1HashBits].
- 		 self classAtIndex: o1HashBits put: obj2]!

Item was removed:
- ----- Method: SpurMemoryManager>>ensureAdequateClassTableBitmap (in category 'class table') -----
- ensureAdequateClassTableBitmap
- 	"The classTableBitmap is used to reclaim unused and/or duplicate entries
- 	 in the classTable.  As such it is notionally 2^(22 - 3) bytes big, or 512k,
- 	 a little too large to be comfortable allocating statically (especially on small
- 	 machines).  So make it big enough for the max classTableIndex's base 2 ceiling."
- 	<inline: false>
- 	| requiredSize |
- 	requiredSize := (1 << numClassTablePages highBit)
- 					* (self classTablePageSize / BitsPerByte).
- 	self cCode:
- 			[classTableBitmap ifNotNil:
- 				[self free: classTableBitmap].
- 			 classTableBitmap := self malloc: requiredSize.
- 			 classTableBitmap ifNil:
- 				[self error: 'could not allocate classTableBitmap'].
- 			 self me: classTableBitmap ms: 0 et: requiredSize]
- 		inSmalltalk:
- 			[classTableBitmap := CArrayAccessor on: (ByteArray new: requiredSize)]!

Item was changed:
  ----- Method: SpurMemoryManager>>enterIntoClassTable: (in category 'class table') -----
  enterIntoClassTable: aBehavior
  	"Enter aBehavior into the class table and answer 0.  Otherwise answer a primitive failure code."
  	<inline: false>
  	| initialMajorIndex majorIndex minorIndex page |
  	majorIndex := classTableIndex >> self classTableMajorIndexShift.
  	initialMajorIndex := majorIndex.
  	"classTableIndex should never index the first page; it's reserved for known classes"
  	self assert: initialMajorIndex > 0.
  	minorIndex := classTableIndex bitAnd: self classTableMinorIndexMask.
  
  	[page := self fetchPointer: majorIndex ofObject: hiddenRootsObj.
  	 page = nilObj ifTrue:
  		[page := self allocateSlotsInOldSpace: self classTablePageSize
  					format: self arrayFormat
  					classIndex: self arrayClassIndexPun.
  		 page ifNil:
  			[^PrimErrNoMemory].
  		 self fillObj: page numSlots: self classTablePageSize with: nilObj.
  		 self storePointer: majorIndex
  			ofObject: hiddenRootsObj
  			withValue: page.
  		 numClassTablePages := numClassTablePages + 1.
  		 minorIndex := 0].
  	 minorIndex to: self classTablePageSize - 1 do:
  		[:i|
  		(self fetchPointer: i ofObject: page) = nilObj ifTrue:
  			[classTableIndex := majorIndex << self classTableMajorIndexShift + i.
  			 self storePointer: i
  				ofObject: page
  				withValue: aBehavior.
  			 self setHashBitsOf: aBehavior to: classTableIndex.
  			 self assert: (self classAtIndex: (self rawHashBitsOf: aBehavior)) = aBehavior.
- 			 self ensureAdequateClassTableBitmap.
  			 ^0]].
  	 majorIndex := (majorIndex + 1 bitAnd: self classIndexMask) max: 1.
  	 majorIndex = initialMajorIndex ifTrue: "wrapped; table full"
  		[^PrimErrLimitExceeded]] repeat!

Item was changed:
  ----- Method: SpurMemoryManager>>expungeDuplicateAndUnmarkedClasses: (in category 'class table') -----
  expungeDuplicateAndUnmarkedClasses: expungeUnmarked
  	"Bits have been set in the classTableBitmap corresponding to
  	 used classes.  Any class in the class table that does not have a
  	 bit set has no instances with that class index.  However, becomeForward:
+ 	 can create duplicate entries, and these duplicate entries won't match their
+ 	 identityHash. So expunge duplicates by eliminating unmarked entries that
+ 	 don't occur at their identityHash."
- 	 can create duplicate entries, and these duplicate entries
- 		a) won't have a bit set on load (because there are no forwarders on load),
- 		b) wont match their identityHash.
- 	 So expunge duplicates by eliminating unmarked entries that don't occur at
- 	 their identityHash.
- 	 Further, any class in the table that is unmarked will also not have a bit set so
- 	 eliminate unmarked classes using the bitmap too."
  	1 to: numClassTablePages - 1 do: "Avoid expunging the puns by not scanning the 0th page."
  		[:i| | classTablePage |
- 		"optimize scan by only scanning bitmap in regions that have pages."
  		classTablePage := self fetchPointer: i ofObject: hiddenRootsObj.
+ 		 0 to: self classTablePageSize - 1 do:
+ 			[:j| | classOrNil classIndex |
+ 			 classOrNil := self fetchPointer: j ofObject: classTablePage.
+ 			 classIndex := i << self classTableMajorIndexShift + j.
+ 			 self assert: (classOrNil = nilObj or: [coInterpreter addressCouldBeClassObj: classOrNil]).
+ 			 "only remove a class if it is at a duplicate entry or it is unmarked and we're expunging unmarked classes."
+ 			 classOrNil = nilObj
+ 				ifTrue:
+ 					[classIndex < classTableIndex ifTrue:
+ 						[classTableIndex := classIndex]]
+ 				ifFalse:
+ 					[((expungeUnmarked and: [(self isMarked: classOrNil) not])
+ 					   or: [(self rawHashBitsOf: classOrNil) ~= classIndex]) ifTrue:
+ 						[self storePointerUnchecked: j
+ 							ofObject: classTablePage
+ 							withValue: nilObj.
+ 						 "but if it is marked, it should still be in the table at its correct index."
+ 						 self assert: ((expungeUnmarked and: [(self isMarked: classOrNil) not])
+ 									or: [(self classAtIndex: (self rawHashBitsOf: classOrNil)) = classOrNil]).
+ 						 "If the removed class is before the classTableIndex, set the
+ 						  classTableIndex to point to the empty slot so as to reuse it asap."
+ 						 classIndex < classTableIndex ifTrue:
+ 							[classTableIndex := classIndex]]]]]!
- 		classTablePage ~= nilObj ifTrue:
- 			[i << self classTableMajorIndexShift
- 				to: i << self classTableMajorIndexShift + self classTableMinorIndexMask
- 				by: 8
- 				do: [:majorBitIndex| | byteIndex byte classIndex classOrNil |
- 					"optimize scan by scanning a byte of indices (8 indices) at a time"
- 					byteIndex := majorBitIndex / BitsPerByte.
- 					byte := classTableBitmap at: byteIndex.
- 					byte ~= 255 ifTrue:
- 						[0 to: 7 do:
- 							[:minorBitIndex|
- 							(byte noMask: 1 << minorBitIndex) ifTrue:
- 								[classIndex := majorBitIndex + minorBitIndex.
- 								 classOrNil := self fetchPointer: (classIndex bitAnd: self classTableMinorIndexMask)
- 												   ofObject: classTablePage.
- 								 self assert: (self classAtIndex: classIndex) = classOrNil.
- 								 self assert: (classOrNil = nilObj or: [coInterpreter addressCouldBeClassObj: classOrNil]).
- 								 "only remove a class if it is at a duplicate entry or it is unmarked and we're expunging unmarked classes."
- 								 classOrNil = nilObj
- 									ifTrue:
- 										[classIndex < classTableIndex ifTrue:
- 											[classTableIndex := classIndex]]
- 									ifFalse:
- 										[((expungeUnmarked and: [(self isMarked: classOrNil) not])
- 										  or: [(self rawHashBitsOf: classOrNil) ~= classIndex]) ifTrue:
- 										[self storePointerUnchecked: (classIndex bitAnd: self classTableMinorIndexMask)
- 											ofObject: classTablePage
- 											withValue: nilObj.
- 										 "but if it is marked, it should still be in the table at its correct index."
- 										 self assert: ((expungeUnmarked and: [(self isMarked: classOrNil) not])
- 													or: [(self classAtIndex: (self rawHashBitsOf: classOrNil)) = classOrNil]).
- 										 "If the removed class is before the classTableIndex, set the
- 										  classTableIndex to point to the empty slot so as to reuse it asap."
- 										 classIndex < classTableIndex ifTrue:
- 											[classTableIndex := classIndex]]]]]]]]]!

Item was changed:
  ----- Method: SpurMemoryManager>>globalGarbageCollect (in category 'gc - global') -----
  globalGarbageCollect
  	<inline: true> "inline into fullGC"
- 	self cCode: [] inSmalltalk: [self halt: 'GC number ', statFullGCs printString].
- 
  	self assert: self validObjStacks.
  	self assert: (self isEmptyObjStack: markStack).
  	self assert: (self isEmptyObjStack: weaklingStack).
  
  	self markObjects.
  	self expungeDuplicateAndUnmarkedClasses: true.
  	self nilUnmarkedWeaklingSlots.
  	self freeUnmarkedObjectsAndSortAndCoalesceFreeSpaceForPigCompact.
  
  	"Mid-way the leak check must be more lenient.  Unmarked classes will have been
  	 expunged from the table, but unmarked instances will not yet have been reclaimed."
  	self runLeakCheckerForFullGC: true
  		excludeUnmarkedNewSpaceObjs: true
  		classIndicesShouldBeValid: true.
  
  	self compact.
  	self setHeapSizeAtPreviousGC.
  
  	self assert: self validObjStacks.
  	self assert: (self isEmptyObjStack: markStack).
  	self assert: (self isEmptyObjStack: weaklingStack).
  	self assert: self allObjectsUnmarked.
  	self runLeakCheckerForFullGC: true!

Item was removed:
- ----- Method: SpurMemoryManager>>inClassTableBitmapSet: (in category 'class table') -----
- inClassTableBitmapSet: classIndex
- 	| bit majorIndex |
- 	self assert: (classIndex >= self firstClassIndexPun
- 				 and: [classIndex <= self classIndexMask]).
- 	majorIndex := classIndex // BitsPerByte.
- 	bit := 1 << (classIndex bitAnd: BitsPerByte - 1).
- 	classTableBitmap
- 		at: majorIndex
- 		put: ((classTableBitmap at: majorIndex) bitOr: bit)!

Item was changed:
  ----- Method: SpurMemoryManager>>initializeObjectMemory: (in category 'initialization') -----
  initializeObjectMemory: bytesToShift
  	"Initialize object memory variables at startup time. Assume endOfMemory at al are
  	 initialised by the image-reading code via setHeapBase:memoryLimit:endOfMemory:.
  	 endOfMemory is assumed to point to the end of the last object in the image.
  	 Assume: image reader also initializes the following variables:
  		specialObjectsOop
  		lastHash"
  	<inline: false>
  	| freeListObj |
  	"Catch mis-initializations leading to bad translations to C"
  	self assert: BaseHeaderSize = self baseHeaderSize.
  	self assert: (self maxSlotsForAlloc * BytesPerWord) asInteger > 0.
  	self bootstrapping ifFalse:
  		[self
  			initSegmentBridgeWithBytes: self bridgeSize
  			at: endOfMemory - self bridgeSize].
  	segmentManager adjustSegmentSwizzlesBy: bytesToShift.
  	"image may be at a different address; adjust oops for new location"
  	self adjustAllOopsBy: bytesToShift.
  	specialObjectsOop := segmentManager swizzleObj: specialObjectsOop.
  
  	"heavily used special objects"
  	nilObj		:= self splObj: NilObject.
  	falseObj	:= self splObj: FalseObject.
  	trueObj		:= self splObj: TrueObject.
  
  	"In Cog we insist that nil, true & false are next to each other (Cogit generates tighter
  	 conditional branch code as a result).  In addition, Spur places the free lists and
  	 class table root page immediately following them."
  	self assert: nilObj = oldSpaceStart.
  	self assert: falseObj = (self objectAfter: nilObj).
  	self assert: trueObj = (self objectAfter: falseObj).
  	freeListObj := self objectAfter: trueObj.
+ 	self hiddenRootsObj: (self objectAfter: freeListObj).
- 	self reInitializeClassTablePostLoad: (self objectAfter: freeListObj).
  	markStack := self swizzleObjStackAt: MarkStackRootIndex.
  	weaklingStack := self swizzleObjStackAt: WeaklingStackRootIndex.
  	ephemeronQueue := self swizzleObjStackAt: EphemeronQueueRootIndex.
  	self assert: self validObjStacks.
  	self assert: (self isEmptyObjStack: markStack).
  	self assert: (self isEmptyObjStack: weaklingStack).
  
  	self initializeFreeSpacePostLoad: freeListObj.
  	segmentManager collapseSegmentsPostSwizzle.
  	self computeFreeSpacePostSwizzle.
  	self initializeOldSpaceFirstFree: freeOldSpaceStart. "initializes endOfMemory, freeStart, free space"
  	"self bootstrapping ifFalse:
  		["self initializeNewSpaceVariables.
  		 scavenger initializeRememberedSet"]".
  	segmentManager checkSegments.
  
  	numCompactionPasses := CompactionPassesForGC.
  
  	"These defaults should depend on machine size; e.g. too small on a powerful laptop, too big on a Pi."
  	growHeadroom := 16*1024*1024.		"headroom when growing"
  	shrinkThreshold := 32*1024*1024.		"free space before shrinking"
  	self setHeapSizeAtPreviousGC.
  	heapGrowthToSizeGCRatio := 0.333333. "By default GC after scavenge if heap has grown by a third since the last GC"!

Item was added:
+ ----- Method: SpurMemoryManager>>isClassAtUniqueIndex: (in category 'class table') -----
+ isClassAtUniqueIndex: aClass
+ 	"Answer if aClass exists at only one index in the class table."
+ 	| expectedIndex |
+ 	expectedIndex := self rawHashBitsOf: aClass.
+ 	self classTableEntriesDo:
+ 		[:entry :index|
+ 		 (entry = aClass and: [index ~= expectedIndex]) ifTrue:
+ 			[^false]].
+ 	^true!

Item was changed:
  ----- Method: SpurMemoryManager>>markAndShouldScan: (in category 'gc - global') -----
  markAndShouldScan: objOop
  	"Helper for markAndTrace:.
  	 Mark the argument, and answer if its fields should be scanned now.
  	 Immediate objects don't need to be marked.
  	 Already marked objects have already been processed.
  	 Pure bits objects don't need scanning, although their class does.
  	 Weak objects should be pushed on the weakling stack.
  	 Anything else need scanning."
  	| format |
  	<inline: true>
  	(self isImmediate: objOop) ifTrue:
  		[^false].
  	"if markAndTrace: is to follow and eliminate forwarding pointers
  	 in its scan it cannot be handed an r-value which is forwarded."
  	self assert: (self isForwarded: objOop) not.
  	(self isMarked: objOop) ifTrue:
  		[^false].
  	self setIsMarkedOf: objOop to: true.
  	format := self formatOf: objOop.
+ 	(self isPureBitsFormat: format) ifTrue: "avoid pushing non-pointer objects on the markStack."
+ 		["Avoid tracing classes of non-objects on the heap, e.g. IRC caches, Sista counters."
+ 		 (self classIndexOf: objOop) > self lastClassIndexPun ifTrue:
+ 			[self markAndTraceClassOf: objOop].
- 	(self isPureBitsFormat: format) ifTrue: "avoid pushing non-pointer objects on the markStack"
- 		[self markAndTraceClassOf: objOop.
  		 ^false].
  	format = self weakArrayFormat ifTrue: "push weaklings on the weakling stack to scan later"
  		[self push: objOop onObjStack: weaklingStack.
  		 ^false].
  	(format = self ephemeronFormat
  	 and: [self activeAndDeferredScan: objOop]) ifTrue:
  		[^false].
  	^true!

Item was changed:
  ----- Method: SpurMemoryManager>>markAndTraceClassOf: (in category 'gc - global') -----
  markAndTraceClassOf: objOop
  	"Ensure the class of the argument is marked, pushing it on the markStack if not already marked.
  	 Also set the relevant bit in the classTableBitmap so that duplicate entries can be eliminated.
+ 	 And for one-way become, which can create duplicate entries in the class table, make sure
+ 	 objOop's classIndex refers to the classObj's actual classIndex.
  	 Note that this is recursive, but the metaclass chain should terminate quickly."
  	<inline: false>
+ 	| classIndex classObj realClassIndex |
- 	| classIndex classObj |
  	classIndex := self classIndexOf: objOop.
- 	self inClassTableBitmapSet: classIndex.
  	classObj := self classOrNilAtIndex: classIndex.
+ 	realClassIndex := self rawHashBitsOf: classObj.
+ 	classIndex ~= realClassIndex ifTrue:
+ 		[self setClassIndexOf: objOop to: realClassIndex].
  	(self isMarked: classObj) ifFalse:
  		[self setIsMarkedOf: classObj to: true.
  		 self markAndTraceClassOf: classObj.
  		 self push: classObj onObjStack: markStack]!

Item was changed:
  ----- Method: SpurMemoryManager>>markObjects (in category 'gc - global') -----
  markObjects
  	<inline: #never> "for profiling"
  	"Mark all accessible objects."
  	"If the incremental collector is running mark bits may be set; stop it and clear them if necessary."
  	self cCode: '' inSmalltalk: [coInterpreter transcript nextPutAll: 'marking...'; flush].
  	self runLeakCheckerForFullGC: true.
  
  	self ensureAllMarkBitsAreZero.
- 	self ensureAdequateClassTableBitmap.
  	self initializeUnscannedEphemerons.
  	self initializeMarkStack.
  	self initializeWeaklingStack.
  	self markAccessibleObjects!

Item was changed:
  ----- Method: SpurMemoryManager>>pinObject: (in category 'primitive support') -----
  pinObject: objOop
+ 	"Attempt to pin objOop, which must not be immediate.
+ 	 If the attempt succeeds answer objOop's (possibly moved) oop.
+ 	 If the attempt fails, which can only occur if there is no memory, answer 0."
+ 	<inline: false>
  	| oldClone seg |
  	<var: #seg type: #'SpurSegmentInfo *'>
  	self assert: (self isNonImmediate: objOop).
  	self flag: 'policy decision here. if already old, do we clone in a segment containing pinned objects or merely pin?'.
  	"We choose to clone to keep pinned objects together to reduce fragmentation,
  	 assuming that pinning is rare and that fragmentation is a bad thing."
  	(self isOldObject: objOop) ifTrue:
  		[seg := segmentManager segmentContainingObj: objOop.
  		 seg containsPinned ifTrue:
  			[self setIsPinnedOf: objOop to: true.
+ 			 ^objOop].
- 			 ^0].
  		 segmentManager someSegmentContainsPinned ifFalse:
  			[self setIsPinnedOf: objOop to: true.
  			 seg containsPinned: true.
+ 			 ^objOop]].
- 			 ^0]].
  	oldClone := self cloneInOldSpaceForPinning: objOop.
+ 	oldClone ~= 0 ifTrue:
+ 		[self setIsPinnedOf: oldClone to: true.
+ 		 self forward: objOop to: oldClone].
+ 	^oldClone!
- 	oldClone = 0 ifTrue:
- 		[^PrimErrNoMemory].
- 	self setIsPinnedOf: oldClone to: true.
- 	self forward: objOop to: oldClone.
- 	^0!

Item was added:
+ ----- Method: SpurMemoryManager>>purgeDuplicateClassTableEntriesFor: (in category 'class table') -----
+ purgeDuplicateClassTableEntriesFor: aClass
+ 	"Given that either marking or allInstances has ensured that
+ 	 all instances of aClass have the class's hash as their class
+ 	 index, ensure aClass is in the table only at its hash."
+ 	| expectedIndex |
+ 	expectedIndex := self rawHashBitsOf: aClass.
+ 	self classTableEntriesDo:
+ 		[:entry :index|
+ 		 (entry = aClass and: [index ~= expectedIndex]) ifTrue:
+ 			[self classAtIndex: index put: nilObj.
+ 			 index < classTableIndex ifTrue:
+ 				[classTableIndex := index]]]!

Item was removed:
- ----- Method: SpurMemoryManager>>reInitializeClassTablePostLoad: (in category 'class table') -----
- reInitializeClassTablePostLoad: hiddenRoots
- 	self hiddenRootsObj: hiddenRoots.
- 	self expungeDuplicateAndUnmarkedClasses: false!

Item was added:
+ ----- Method: SpurMemoryManager>>uniqueIndex:allInstancesInto:limit:resultsInto: (in category 'primitive support') -----
+ uniqueIndex: classIndex allInstancesInto: start limit: limit resultsInto: binaryBlock
+ 	<inline: true>
+ 	| count ptr |
+ 	count := 0.
+ 	ptr := start.
+ 	self allHeapEntitiesDo:
+ 		[:obj| "continue enumerating even if no room so as to unmark all objects."
+ 		 (MarkObjectsForEnumerationPrimitives
+ 				ifTrue: [self isMarked: obj]
+ 				ifFalse: [true]) ifTrue:
+ 			[(self isNormalObject: obj)
+ 				ifTrue:
+ 					[MarkObjectsForEnumerationPrimitives ifTrue:
+ 						[self setIsMarkedOf: obj to: false].
+ 					 (self classIndexOf: obj) = classIndex ifTrue:
+ 					 	[count := count + 1.
+ 						 ptr < limit ifTrue:
+ 							[self longAt: ptr put: obj.
+ 							 ptr := ptr + self bytesPerSlot]]]
+ 				ifFalse:
+ 					[MarkObjectsForEnumerationPrimitives ifTrue:
+ 						[(self isSegmentBridge: obj) ifFalse:
+ 							[self setIsMarkedOf: obj to: false]]]]].
+ 	binaryBlock value: count value: ptr
+ !

Item was changed:
  ----- Method: StackInterpreter>>callPrimitiveBytecode (in category 'miscellaneous bytecodes') -----
  callPrimitiveBytecode
  	"V4:			249		11111001	i i i i i i i i	jjjjjjjj		Call Primitive #iiiiiiii + (jjjjjjjj * 256)
  	 SistaV1:	248		11111000 	iiiiiiii		mjjjjjjj		Call Primitive #iiiiiiii + (jjjjjjj * 256) m=1 means inlined primitive, no hard return after execution.
  	 V3/Spur:	139		10001011	i i i i i i i i	jjjjjjjj		Call Primitive #iiiiiiii + (jjjjjjjj * 256)"
  	"Note that we simply skip a callPrimitiveBytecode at the start of a method
  	 that contains a primitive.  This because methods like Context(Part)>>reset
  	 have to be updated to skip the callPrimitive bytecode otherwise."
  	self cppIf: SistaVM
  		ifTrue:
  			[| byte1 byte2 prim header |
  			 byte1 := self fetchByte.
  			 byte2 := self fetchByte.
  			 self fetchNextBytecode.
  			 byte2 < 128 ifTrue:
  				[header := self methodHeaderOf: method.
  				 ((self methodHeaderHasPrimitive: header)
  				  and: [localIP = (self initialPCForHeader: header method: method) + (self sizeOfCallPrimitiveBytecode: header)]) ifTrue:
  					[^self].
  				 localIP := localIP - 3.
  				 ^self respondToUnknownBytecode].
  			 prim := byte2 - 128 << 8 + byte1.
  			 prim < 1000 ifTrue:
  				[^self nullaryInlinePrimitive: prim].
  
  			 prim < 2000 ifTrue:
  				[^self unaryInlinePrimitive: prim - 1000].
  				
  			 prim < 3000 ifTrue:
  				[^self binaryInlinePrimitive: prim - 2000].
  
  			 prim < 4000 ifTrue:
  				[^self trinaryInlinePrimitive: prim - 3000].
  
  			 localIP := localIP - 3.
  			 ^self respondToUnknownBytecode]
  		ifFalse:
  			[| header |
  			 header := self methodHeaderOf: method.
  			 ((self methodHeaderHasPrimitive: header)
+ 			  and: [localIP asInteger = (self initialPCForHeader: header method: method)])
- 			  and: [localIP = (self initialPCForHeader: header method: method)])
  				ifTrue:
  					[localIP := localIP + (self sizeOfCallPrimitiveBytecode: header).
  					 ^self fetchNextBytecode]
  				ifFalse:
  					[^self respondToUnknownBytecode]]!


