[Vm-dev] VM Maker: VMMaker.oscog.seperateMarking-WoC.3258.mcz

commits at source.squeak.org
Tue Sep 6 15:04:09 UTC 2022


Tom Braun uploaded a new version of VMMaker to project VM Maker:
http://source.squeak.org/VMMaker/VMMaker.oscog.seperateMarking-WoC.3258.mcz

==================== Summary ====================

Name: VMMaker.oscog.seperateMarking-WoC.3258
Author: WoC
Time: 6 September 2022, 5:03:41.73246 pm
UUID: 31c9ff6b-6e47-4e1d-a1fc-4b10d2a61365
Ancestors: VMMaker.oscog.seperateMarking-WoC.3257, VMMaker.oscog-eem.3252

- added some debugging helpers
- consider young objects more during collection
- assert for (forgotten) unmarked classes before expunging them

=============== Diff against VMMaker.oscog.seperateMarking-WoC.3257 ===============

Item was removed:
- ----- Method: SpurAllAtOnceMarker>>markObjects: (in category 'marking') -----
- markObjects: objectsShouldBeUnmarkedAndUnmarkedClassesShouldBeExpunged
- 	<inline: #never> "for profiling"
- 	"Mark all accessible objects.  objectsShouldBeUnmarkedAndUnmarkedClassesShouldBeExpunged
- 	 is true if all objects are unmarked and/or if unmarked classes shoud be removed from the class table."
- 	"If the incremental collector is running mark bits may be set; stop it and clear them if necessary."
- 	self cCode: '' inSmalltalk: [coInterpreter transcript nextPutAll: 'marking...'; flush].
- 	manager runLeakCheckerFor: GCModeFull.
- 
- 	manager shutDownGlobalIncrementalGC: objectsShouldBeUnmarkedAndUnmarkedClassesShouldBeExpunged.
- 	manager initializeUnscannedEphemerons.
- 	manager initializeMarkStack.
- 	manager initializeWeaklingStack.
- 	marking := true.
- 	self markAccessibleObjectsAndFireEphemerons.
- 	manager expungeDuplicateAndUnmarkedClasses: objectsShouldBeUnmarkedAndUnmarkedClassesShouldBeExpunged.
- 	manager nilUnmarkedWeaklingSlots.
- 	marking := false!

Item was added:
+ ----- Method: SpurAllAtOnceMarker>>markersMarkObjects: (in category 'marking') -----
+ markersMarkObjects: objectsShouldBeUnmarkedAndUnmarkedClassesShouldBeExpunged
+ 	<inline: #never> "for profiling"
+ 	"Mark all accessible objects.  objectsShouldBeUnmarkedAndUnmarkedClassesShouldBeExpunged
+ 	 is true if all objects are unmarked and/or if unmarked classes should be removed from the class table."
+ 	"If the incremental collector is running mark bits may be set; stop it and clear them if necessary."
+ 	self cCode: '' inSmalltalk: [coInterpreter transcript nextPutAll: 'marking...'; flush].
+ 	manager runLeakCheckerFor: GCModeFull.
+ 
+ 	manager shutDownGlobalIncrementalGC: objectsShouldBeUnmarkedAndUnmarkedClassesShouldBeExpunged.
+ 	manager initializeUnscannedEphemerons.
+ 	manager initializeMarkStack.
+ 	manager initializeWeaklingStack.
+ 	marking := true.
+ 	self markAccessibleObjectsAndFireEphemerons.
+ 	manager expungeDuplicateAndUnmarkedClasses: objectsShouldBeUnmarkedAndUnmarkedClassesShouldBeExpunged.
+ 	manager nilUnmarkedWeaklingSlotsExcludingYoungObjects: false.
+ 	marking := false!

Item was removed:
- ----- Method: SpurGarbageCollector>>allocatorMarkBitToSet (in category 'as yet unclassified') -----
- allocatorMarkBitToSet
- 
- 	^ allocatorMarkBitToSet!

Item was added:
+ ----- Method: SpurGarbageCollector>>markObjects: (in category 'as yet unclassified') -----
+ markObjects: objectsShouldBeUnmarkedAndUnmarkedClassesShouldBeExpunged
+ 
+ 	marker markersMarkObjects: objectsShouldBeUnmarkedAndUnmarkedClassesShouldBeExpunged!

Item was added:
+ ----- Method: SpurGarbageCollector>>maybeModifyCopiedObject: (in category 'object creation barriers') -----
+ maybeModifyCopiedObject: objOop
+ 
+ 	<doNotGenerate>!

Item was changed:
  ----- Method: SpurGenerationScavenger>>copyToOldSpace:bytes:format: (in category 'scavenger') -----
  copyToOldSpace: survivor bytes: bytesInObject format: formatOfSurvivor
  	"Copy survivor to oldSpace.  Answer the new oop of the object."
  	<inline: #never> "Should be too infrequent to lower icache density of copyAndForward:"
  	| nTenures startOfSurvivor newStart newOop |
  	self assert: (formatOfSurvivor = (manager formatOf: survivor)
  				and: [((manager isMarked: survivor) not or: [tenureCriterion = MarkOnTenure])
  				and: [tenureCriterion = TenureToShrinkRT
  					or: [(manager isPinned: survivor) not
  						and: [(manager isRemembered: survivor) not]]]]).
  	nTenures := statTenures.
  	startOfSurvivor := manager startOfObject: survivor.
  	newStart := manager allocateOldSpaceChunkOfBytes: bytesInObject.
  	newStart ifNil:
  		[manager growOldSpaceByAtLeast: 0. "grow by growHeadroom"
  		 newStart := manager allocateOldSpaceChunkOfBytes: bytesInObject.
  		 newStart ifNil:
  			[self error: 'out of memory']].
  	"manager checkFreeSpace."
  	manager memcpy: newStart asVoidPointer _: startOfSurvivor asVoidPointer _: bytesInObject.
  	newOop := newStart + (survivor - startOfSurvivor).
  	tenureCriterion >= (TenureToShrinkRT min: MarkOnTenure) ifTrue:
  		[tenureCriterion = TenureToShrinkRT ifTrue:
  			[manager rtRefCountOf: newOop put: 0].
  		 tenureCriterion = MarkOnTenure ifTrue:
+ 			[self flag: #Todo. "as we do this later on probably can delete branch"
+ 			manager setIsMarkedOf: newOop to: true]].
- 			[manager setIsMarkedOf: newOop to: true]].
  	
+ 	manager gc maybeModifyCopiedObject: newOop.
+ 	
- 	manager gc maybeModifyGCFlagsOf: newOop.
  	statTenures := nTenures + 1.
  	(manager isAnyPointerFormat: formatOfSurvivor) ifTrue:
  		["A very quick and dirty scan to find young referents.  If we misidentify bytes
  		  in a CompiledMethod as young we don't care; it's unlikely, and a subsequent
  		  scan of the rt will filter the object out.  But it's good to filter here because
  		  otherwise an attempt to shrink the RT may simply fill it up with new objects,
  		  and here the data is likely in the cache."
  		 manager baseHeaderSize to: bytesInObject - (survivor - startOfSurvivor) - manager wordSize by: manager wordSize do:
  			[:p| | field |
  			field := manager longAt: survivor + p.
  			(manager isReallyYoung: field) ifTrue:
  				[self remember: newOop.
  				 ^newOop]]].
  	^newOop!

Item was changed:
  ----- Method: SpurIncrementalCompactor>>initCompactionIfNecessary (in category 'incremental compaction') -----
  initCompactionIfNecessary
  
  	isCompacting
  		ifFalse: [self assertNoSegmentBeingCompacted.
  				self planCompactionAndReserveSpace.
  				
  				self assert: manager totalFreeOldSpace = manager totalFreeListBytes.
  				
  				shouldCompact ifTrue: [currentHeapPointer := segmentToFill segStart]].
  			
  	isCompacting := true.
  	
+ 	self assert: currentSegment = 0
- 	self assert: currentSegment notNil
  	!

Item was changed:
  ----- Method: SpurIncrementalGarbageCollector>>assertSettingGCFlagsIsOk: (in category 'as yet unclassified') -----
  assertSettingGCFlagsIsOk: objOop
  
  	"do not color young objects. They have an extra state we do not want to change"
  	self assert: (manager isOldObject: objOop).
  	
  	"while sweeping: do not color objects behind the currently point the sweeper is at. This would infer with the next marking pass"
+ 	self assert: (self allocatorShouldAllocateBlack not or: [objOop >= self compactor currentSweepingEntity]).
+ 	
+ 	(self allocatorShouldAllocateBlack not or: [objOop >= self compactor currentSweepingEntity])
+ 		ifFalse: [self cCode: 'error("foo")'.]!
- 	self assert: (self allocatorShouldAllocateBlack not or: [objOop >= self compactor currentSweepingEntity])!

Item was changed:
  ----- Method: SpurIncrementalGarbageCollector>>doIncrementalCollect (in category 'as yet unclassified') -----
  doIncrementalCollect
  	
+ 	manager statScavenges = 9300 ifTrue: [self halt].
  	phase = InMarkingPhase
+ 		ifTrue: [ | finishedMarking |
- 		ifTrue: [
  			coInterpreter cr; print: 'start marking '; tab; flush.
+ 			finishedMarking := marker incrementalMarkObjects.
+ 			
+ 			"self assert: manager validObjectColors."
+ 			
+ 			finishedMarking
- 			marker incrementalMarkObjects
  				ifTrue: [
  					manager allPastSpaceObjectsDo: [:obj | self assert: (manager isWhite: obj)].
- 					manager allOldSpaceObjectsDo: [:ea | (manager isForwarded: ea) ifTrue: [self halt] ].
  					
  					"when sweeping the mutator needs to allocate new objects black as we do not have any information about them.
  					We only know if they should get swept after the next marking -> keep them alive for this cycle"
  					self allocatorShouldAllocateBlack: true.
  					compactor setInitialSweepingEntity.
  					phase := InSweepingPhase.
  					
  					"marking is done and thus all forwarding references are resolved -> we can use the now free segments that were 
  					compacted during the last cycle"
  					compactor freePastSegmentsAndSetSegmentToFill.
  					
  					coInterpreter cr; print: 'finish marking '; tab; flush.
  					
  					manager 
  						setCheckForLeaks: GCCheckFreeSpace + GCModeFull;
  						runLeakCheckerFor: GCModeFull;
  						checkFreeSpace: GCModeFull.
  						
  					
  					^ self]
  				ifFalse: [coInterpreter cr; print: 'finish marking pass'; tab; flush.manager runLeakCheckerFor: GCModeIncremental]].
  		
  	phase = InSweepingPhase
  		ifTrue: [
  			coInterpreter cr; print: 'start sweeping '; tab; flush.
  			compactor incrementalSweep
  				ifTrue: [
  					self allocatorShouldAllocateBlack: false.
  					manager allOldSpaceObjectsDo: [:ea | self assert: (manager isWhite: ea) ].
  					"self assert: manager allObjectsUnmarked."
  					
  					coInterpreter cr; print: 'finish sweeping '; tab; flush.
  					
  					manager 
  						setCheckForLeaks: GCCheckFreeSpace + GCModeFull;
  						runLeakCheckerFor: GCModeFull;
  						checkFreeSpace: GCModeFull.
  					
  					phase := InCompactingPhase.
  					^ self]].
  		
  	phase = InCompactingPhase
  		ifTrue: [
  			coInterpreter cr; print: 'start compacting '; tab; flush.
  			compactor incrementalCompact
  				ifTrue: [
  					coInterpreter cr; print: 'finish compacting '; tab; flush.
  					manager 
  						setCheckForLeaks: GCCheckFreeSpace + GCModeFull;
  						runLeakCheckerFor: GCModeFull;
  						checkFreeSpace: GCModeFull.
  					
  					phase := InMarkingPhase.
+ 					
  					^ self]]!

Item was changed:
  ----- Method: SpurIncrementalGarbageCollector>>fullGC (in category 'global') -----
  fullGC
  	"We need to be able to make a full GC, e.g. when we save the image. Use the made progress and finish the collection"
  	
  	"incredible hacky solution. Will later on be replaced with the old collection, but for now use this to keep the state transitions consistent"
  	
  	self assert: manager validObjStacks.
+ 	
- 	self halt.
  	coInterpreter cr; print: 'start fullGC '; tab; flush.
  	
  	coInterpreter setGCMode: GCModeNewSpace.
  	self doScavengeWithoutIncrementalCollect: MarkOnTenure.
  	
+ 	coInterpreter setGCMode: GCModeIncremental.
  	phase = InMarkingPhase
  		ifTrue: [
  			"end marking"
  			[phase = InMarkingPhase]
  				whileTrue: [self doIncrementalCollect]].
  		
  		"end this collection cycle"
  		[phase ~= InMarkingPhase]
  			whileTrue: [self doIncrementalCollect].
  			
  		"resolve forwarders in young space"
  		coInterpreter setGCMode: GCModeNewSpace.
  		self doScavengeWithoutIncrementalCollect: MarkOnTenure.
  		
+ 		coInterpreter setGCMode: GCModeIncremental.
+ 		
  		"mark completely"
  		[phase = InMarkingPhase]
  			whileTrue: [self doIncrementalCollect].
  		"do rest of collection"
  		[phase ~= InMarkingPhase]
  			whileTrue: [self doIncrementalCollect].
  	
  	manager setHeapSizeAtPreviousGC.
  	
  	coInterpreter cr; print: 'end fullGC '; tab; flush.
  	
  	^(manager freeLists at: 0) ~= 0
  		ifTrue: [manager bytesInBody: manager findLargestFreeChunk]
  		ifFalse: [0]!

Item was added:
+ ----- Method: SpurIncrementalGarbageCollector>>markObjects: (in category 'as yet unclassified') -----
+ markObjects: objectsShouldBeUnmarkedAndUnmarkedClassesShouldBeExpunged
+ 
+ 	self flag: #Todo. "write more efficient version"
+ 
+ 	coInterpreter setGCMode: GCModeNewSpace.
+ 	self doScavengeWithoutIncrementalCollect: MarkOnTenure.
+ 	
+ 	phase = InMarkingPhase
+ 		ifTrue: [
+ 			"end marking"
+ 			[phase = InMarkingPhase]
+ 				whileTrue: [self doIncrementalCollect]].
+ 		
+ 		"end this collection cycle"
+ 		[phase ~= InMarkingPhase]
+ 			whileTrue: [self doIncrementalCollect].
+ 			
+ 		"resolve forwarders in young space"
+ 		coInterpreter setGCMode: GCModeNewSpace.
+ 		self doScavengeWithoutIncrementalCollect: MarkOnTenure.
+ 		
+ 		"mark completely"
+ 		[phase = InMarkingPhase]
+ 			whileTrue: [self doIncrementalCollect].!

Item was added:
+ ----- Method: SpurIncrementalGarbageCollector>>maybeModifyCopiedObject: (in category 'object creation barriers') -----
+ maybeModifyCopiedObject: objOop
+ 
+ 	"1. when marking always mark as we already could have marked all objects pointing to objOop 
+ 	 2. during sweeping mark objects behind the sweeper's current position so it does not collect objOop"
+ 
+ 	(manager isOldObject: objOop)
+ 		ifTrue: [			
+ 			phase = InMarkingPhase
+ 				ifTrue: [marker markAndShouldScan: objOop].
+ 				
+ 			(phase = InSweepingPhase and: [objOop >= compactor currentSweepingEntity])
+ 				ifTrue: [manager setIsMarkedOf: objOop to: true]]!

Item was changed:
  ----- Method: SpurIncrementalGarbageCollector>>maybeModifyForwarder: (in category 'object creation barriers') -----
  maybeModifyForwarder: objOop
  
  	"mark forwarders so they do not get garbage collected before they can get resolved. 
  	1. Does only apply to marking because only in this phase we can overlook forwarding references to be resolved (e.g. when 
+ 	the mutator runs after the first marking pass and an object that is referenced by at least one already black object gets forwarded -> the pointer of the black object won't get updated in this marking pass and during sweeping the forwarding pointer will get removed).
- 	the mutator runs after the first marking pass and an already black object gets a forwarding pointer -> it will not be followed in this
- 	marking pass and during sweeping the forwardgin pointer will get removes).
  	2. Does not apply to sweeping or compacting because the forwarder is set on the header of the original object, which already includes 
  	the correcty set mark bit"
  	self assert: (manager isForwarded: objOop).
  	((manager isOldObject: objOop) and: [phase = InMarkingPhase])
  		ifTrue: [manager setIsMarkedOf: objOop to: true]!

Item was added:
+ ----- Method: SpurIncrementalMarker>>allReferencedClassesAreMarked (in category 'testing') -----
+ allReferencedClassesAreMarked
+ 
+ 	manager allObjectsDo: [:obj |
+ 		((manager isMarked: obj) or: [(manager isNonImmediate: obj) and: [manager isYoung: obj]])
+ 			ifTrue: [ | unmarkedClass |
+ 				unmarkedClass := self allReferencedClassesAreMarkedFrom: obj lastObj: -1.
+ 		
+ 				unmarkedClass ~= -1
+ 					ifTrue: [coInterpreter cr; print: 'class not marked '; tab; flush.
+ 						coInterpreter longPrintOop: unmarkedClass.
+ 						coInterpreter cr; print: 'referenced by: '; tab; flush.
+ 						coInterpreter longPrintOop: obj.
+ 						
+ 						(manager isYoung: obj)
+ 							ifTrue: [coInterpreter cr; print: 'young '; tab; flush.]
+ 							ifFalse: [coInterpreter cr; print: 'old'; tab; flush.].
+ 						^ false]]].
+ 		
+ 	^ true!

Item was added:
+ ----- Method: SpurIncrementalMarker>>allReferencedClassesAreMarkedFrom:lastObj: (in category 'testing') -----
+ allReferencedClassesAreMarkedFrom: objOop lastObj: lastObj
+ 
+ 	| classIndex classObj |
+ 	classIndex := manager classIndexOf: objOop.
+ 	classObj := manager classOrNilAtIndex: classIndex.
+ 	
+ 	(manager isMarked: classObj)
+ 		ifFalse: [^ classObj].
+ 		
+ 	^ lastObj = classObj
+ 		ifTrue: [-1]
+ 		ifFalse: [self allReferencedClassesAreMarkedFrom: classObj lastObj: objOop]!

Item was added:
+ ----- Method: SpurIncrementalMarker>>allReferencedClassesAreMarkedOrGreyFrom:lastObj: (in category 'testing') -----
+ allReferencedClassesAreMarkedOrGreyFrom: objOop lastObj: lastObj
+ 
+ 	| classIndex classObj |
+ 	classIndex := manager classIndexOf: objOop.
+ 	classObj := manager classOrNilAtIndex: classIndex.
+ 	
+ 	(manager isWhite: classObj)
+ 		ifFalse: [^ classObj].
+ 		
+ 	^ lastObj = classObj
+ 		ifTrue: [-1]
+ 		ifFalse: [self allReferencedClassesAreMarkedOrGreyFrom: classObj lastObj: objOop]!

Item was changed:
  ----- Method: SpurIncrementalMarker>>completeMark (in category 'marking - global') -----
  completeMark
  	"finishes the current mark pass"
  
  	| currentObj slotsLeft |
  	"manager objStack: manager markStack do: [:index :page | Transcript showln: (manager fetchPointer: index ofObject: page)].
  	manager sizeOfObjStack: manager markStack"
  	currentObj := manager popObjStack: manager markStack.
  	currentObj
  		ifNil: [^ true]. "there is nothing more on the stack and we are done"
  		
  	slotsLeft := SlotLimitPerPass.
  	
  	[
  		| slotNumber slotsToVisit startIndex |
  		
  		"after passing the limit we push the current index on the stack. Is the currentObj only an index? "
  		(manager isImmediate: currentObj)
  			ifTrue: [startIndex := currentObj.
  				currentObj := manager popObjStack: manager markStack.]
  			ifFalse: [startIndex := 0].
  			
  		self markAndTraceClassOf: currentObj.
+ 		(self allReferencedClassesAreMarkedOrGreyFrom: currentObj lastObj: -1) ~= -1
+ 			ifFalse: [self cCode: 'raise(SIGINT)'].
  			
  		slotNumber := manager numStrongSlotsOfInephemeral: currentObj.
  		slotsToVisit := slotNumber - startIndex.
  		
  		"we can mark all"
  		slotsLeft := slotsLeft - slotsToVisit.
  		
  		self markFrom: startIndex nSlots: slotsToVisit of: currentObj.		
  
  		"we finished everything there is to be done with to obj -> make it black"
  		self blackenObject: currentObj.
  		currentObj := manager popObjStack: manager markStack.
  	"repeat while there still are objects"
  	currentObj notNil] whileTrue.
  
  	^ true!

Item was changed:
  ----- Method: SpurIncrementalMarker>>finishMarking (in category 'as yet unclassified') -----
  finishMarking
  	"marks the structures needed during GC"
  	<inline: #never>
  	
- 	1 to: manager numClassTablePages - 1 do:
- 		[:i| manager setIsMarkedOf: (manager fetchPointer: i ofObject: manager hiddenRootsObj)
- 				to: true].
- 			
- 	self flag: #Todo. "handle ephemerons"
- 	
  	"lets assume there are not too many for now"
  	self markWeaklingsAndMarkAndFireEphemerons.
+ 	"self assert: self allReferencedClassesAreMarked."
+ 	self allReferencedClassesAreMarked not
+ 		ifTrue: [self cCode: 'raise(SIGINT)'].
+ 	manager expungeDuplicateAndUnmarkedClasses: true ignoringClassesInYoungSpace: true.
+ 	manager nilUnmarkedWeaklingSlotsExcludingYoungObjects: true.
- 	manager expungeDuplicateAndUnmarkedClasses: true.
- 	manager nilUnmarkedWeaklingSlots.
  	
  	self assert: (manager isEmptyObjStack: manager markStack).
  			
  	isCurrentlyMarking := false.
  	marking := false!

Item was changed:
  ----- Method: SpurIncrementalMarker>>incrementalMark (in category 'marking - incremental') -----
  incrementalMark
  	"does one marking cycle. Breaks after a certain amount of slots is marked and the last object, that amount is crossed in, is completely scanned"
  
  	| currentObj slotsLeft |
  	"manager objStack: manager markStack do: [:index :page | Transcript showln: (manager fetchPointer: index ofObject: page)].
  	manager sizeOfObjStack: manager markStack"
  	currentObj := manager popObjStack: manager markStack.
  	"skip young objects. They get already scanned as they are part of the roots"
  	[(currentObj notNil) and: [(manager isNonImmediate: currentObj) and: [manager isYoung: currentObj]]]
+ 			whileTrue: [(manager isInClassTable: currentObj) ifTrue: [manager setIsMarkedOf: currentObj to: true].
+ 				currentObj := manager popObjStack: manager markStack].
- 			whileTrue: [currentObj := manager popObjStack: manager markStack.].
  	
  	currentObj
  		ifNil: [^ true]. "there is nothing more on the stack and we are done"
  		
  	slotsLeft := SlotLimitPerPass.
  	
  	[
  		| slotNumber slotsToVisit startIndex |
  		
  		"after passing the limit we push the current index on the stack. Is the currentObj only an index? "
  		(manager isImmediate: currentObj)
  			ifTrue: [startIndex := manager integerValueOf: currentObj.
  				currentObj := manager popObjStack: manager markStack.]
  			ifFalse: [startIndex := 0.
  				
  				 self markAndTraceClassOf: currentObj.
+ 				(self allReferencedClassesAreMarkedOrGreyFrom: currentObj lastObj: -1) ~= -1
+ 			ifFalse: [self cCode: 'raise(SIGINT)'].
  				
  				"eager color the object black. Either it will get scanned completely and the color is correct
  				or we have at least scanned some of the slots. In the second case the mutator could 
  				modify one of the slots of the object that already were scanned and we would could lose
  				this object. Therefore color the object early to trigger the write barrier on writes. There will
  				be some overhead (trigger the barrier always although only the already scanned slots are
  				technically black) but it seems we need to do this for correctness"
  				self blackenObject: currentObj].
  			
  		slotNumber := manager numStrongSlotsOfInephemeral: currentObj.
  		slotsToVisit := slotNumber - startIndex.
  		
  		slotsLeft - slotsToVisit < 0
  			ifTrue: [
  				self 
  					markFrom: startIndex
  					nSlots: slotsLeft
  					of: currentObj.
  						
  				"If we need to abort earlier we push the index and the currently scanned object on the marking stack. Otherwise it is not possible
  				for immediates to be on the stack (they have no fields to be scanned) -> we can use the immediated to detect this pattern"
  				(manager topOfObjStack: manager markStack) ~= currentObj ifTrue: 
  						[manager push: currentObj onObjStack: manager markStack].
  				manager push: (manager integerObjectOf: slotsLeft) onObjStack: manager markStack.
  				
  				"we need to abort early to not run into some extreme corner cases (giant objects) that would explode our mark time assumptions"
  				^ false]
  			ifFalse: ["we can mark all"
  				slotsLeft := slotsLeft - slotsToVisit.
  				
  				self markFrom: startIndex nSlots: slotsToVisit of: currentObj].		
  
  		currentObj := manager popObjStack: manager markStack.
  		
  		[(currentObj notNil) and: [(manager isNonImmediate: currentObj) and: [manager isYoung: currentObj]]]
+ 			whileTrue: [(manager isInClassTable: currentObj) ifTrue: [manager setIsMarkedOf: currentObj to: true].
+ 				currentObj := manager popObjStack: manager markStack].
- 			whileTrue: [currentObj := manager popObjStack: manager markStack.].
  	"repeat while there still are objects"
  	currentObj notNil] whileTrue.
  
  	^ true!

Item was changed:
  ----- Method: SpurIncrementalMarker>>incrementalMarkFrom: (in category 'marking - incremental') -----
  incrementalMarkFrom: objOop
  	"does one marking cycle. Breaks after a certain amount of slots is marked and the last object, that amount is crossed in, is completely scanned"
  
  	| currentObj slotsLeft |
  	"manager objStack: manager markStack do: [:index :page | Transcript showln: (manager fetchPointer: index ofObject: page)].
  	manager sizeOfObjStack: manager markStack"
  	currentObj := objOop.
  	currentObj
  		ifNil: [^ true]. "there is nothing more on the stack and we are done"
  		
  	slotsLeft := SlotLimitPerPass.
  	
  	[
  		| slotNumber slotsToVisit startIndex |
  		
  		"after passing the limit we push the current index on the stack. Is the currentObj only an index? "
  		(manager isImmediate: currentObj)
  			ifTrue: [startIndex := manager integerValueOf: currentObj.
  				currentObj := manager popObjStack: manager markStack.]
  			ifFalse: [startIndex := 0].
  			
+ 		((manager isYoung: currentObj) and: [manager isInClassTable: currentObj])
+ 			ifTrue: [manager setIsMarkedOf: currentObj to: true].
+ 			
  		slotNumber := manager numStrongSlotsOfInephemeral: currentObj.
  		slotsToVisit := slotNumber - startIndex.
  		
  		slotsLeft - slotsToVisit < 0
  			ifTrue: [
  				self 
  					markFrom: startIndex
  					nSlots: slotsLeft
  					of: currentObj.
  						
  				"If we need to abort earlier we push the index and the currently scanned object on the marking stack. Otherwise it is not possible
  				for immediates to be on the stack (they have no fields to be scanned) -> we can use the immediated to detect this pattern"
  				(manager topOfObjStack: manager markStack) ~= currentObj ifTrue: 
  						[manager push: currentObj onObjStack: manager markStack].
  				manager push: (manager integerObjectOf: slotsLeft + 1) onObjStack: manager markStack.
  				
  				"we need to abort early to not run into some extreme corner cases (giant objects) that would explode our mark time assumptions"
  				^ false]
  			ifFalse: ["we can mark all"
  				slotsLeft := slotsLeft - slotsToVisit.
  				
  				self markFrom: startIndex nSlots: slotsToVisit of: currentObj].		
  
  		"we finished everything there is to be done with to obj -> make it black"
  		self blackenObject: currentObj.
  		currentObj := manager popObjStack: manager markStack.
  	"repeat while there still are objects"
  	currentObj notNil] whileTrue.
  
  	^ true!

Item was changed:
  ----- Method: SpurIncrementalMarker>>incrementalMarkObjects (in category 'marking - incremental') -----
  incrementalMarkObjects
  	"this method is to be run directly after a scavenge -> we can assume there are ony objects in the now past survivor space"
  
  	<inline: #never> "for profiling"
  	
  	"manager runLeakCheckerFor: GCModeIncremental."
  	
  	self initForNewMarkingPassIfNecessary.
  
  	[ | continueMarking |
  	(manager isEmptyObjStack: manager markStack)
  		ifTrue: [self pushAllRootsOnMarkStack.
  			" manager sizeOfObjStack: manager markStack.
  			did we finish marking?"
  			(manager isEmptyObjStack: manager markStack)
  				ifTrue: [self finishMarking.
  					^ true]].
  	
  	
  	"due to a slang limitations we have to assign the result into variable => do not remove!!"
  	continueMarking := self incrementalMark.
  	continueMarking] whileTrue.
  
  	^ false
  	!

Item was changed:
  ----- Method: SpurIncrementalMarker>>initForNewMarkingPassIfNecessary (in category 'marking-initialization') -----
  initForNewMarkingPassIfNecessary
  
  	isCurrentlyMarking 
  		ifFalse: [
  			manager initializeMarkStack.
  			manager initializeWeaklingStack.
  			manager initializeEphemeronStack.
  			
  			"This must come first to enable stack page reclamation.  It clears
  			  the trace flags on stack pages and so must precede any marking.
  			  Otherwise it will clear the trace flags of reached pages."
  			coInterpreter initStackPageGC.
  			
+ 			self markHelperStructures.
+ 			
+ 			isCurrentlyMarking := true.
+ 			marking := true.
+ 			self pushInternalStructuresOnMarkStack].
- 			self markHelperStructures].
  		
+ 	!
- 	isCurrentlyMarking := true.
- 	marking := true!

Item was removed:
- ----- Method: SpurIncrementalMarker>>isLeafInObjectGraph: (in category 'barrier') -----
- isLeafInObjectGraph: anObject
- 	
- 	^ (manager isImmediate: anObject)!

Item was changed:
  ----- Method: SpurIncrementalMarker>>markAndShouldScan: (in category 'marking - incremental') -----
  markAndShouldScan: objOop
  	"marks the object (grey or black as neccessary) and returns if the object should be scanned
  	Objects that get handled later on get marked as black, as they are practically a leaf in the object tree (we scan them later on, so we cannot lose objects and do not
  	need to adhere to the tricolor invariant)"
  
  	| format |
  	<inline: true>
  	(manager isYoung: objOop)
  		ifTrue: [^ false].
  	
  	(manager isImmediate: objOop) ifTrue:
  		[^false].
  	
  	self assert: (manager isForwarded: objOop) not.
  
  	"if it is marked we already did everything we needed to do and if is grey we already saw it and do not have to do anything here"
  	(manager isWhite: objOop) not ifTrue:
  		[^false].
  	
  	format := manager formatOf: objOop.
  	
  	(manager isPureBitsFormat: format) ifTrue: "avoid pushing non-pointer objects on the markStack."
  		["Avoid tracing classes of non-objects on the heap, e.g. IRC caches, Sista counters."
  		 (manager classIndexOf: objOop) > manager lastClassIndexPun ifTrue:
  			[self markAndTraceClassOf: objOop].
  		
  		"the object does not need to enter the marking stack as there are no pointer to visit -> it is already finished and we can make it black"
  		self blackenObject: objOop.
  		 ^false].
  	
  	(manager isWeakFormat: format) ifTrue: "push weaklings on the weakling stack to scan later"
  		[manager push: objOop onObjStack: manager weaklingStack.
  		"do not follow weak references. They get scanned at the end of marking -> it should be ok to not follow the tricolor invariant"
  		self blackenObject: objOop.
  		 ^false].
  	
  	((manager isEphemeronFormat: format)
  	 and: [manager activeAndDeferredScan: objOop]) ifTrue:
  		[self blackenObject: objOop.
  		^false].
  	
  	"we know it is an object that can contain we have to follow"
  	self pushOnMarkingStackAndMakeGrey: objOop.
  	
  	^ true!

Item was changed:
  ----- Method: SpurIncrementalMarker>>markAndTraceClassOf: (in category 'marking - incremental') -----
  markAndTraceClassOf: objOop
  	"Ensure the class of the argument is marked, pushing it on the markStack if not already marked.
  	 And for one-way become, which can create duplicate entries in the class table, make sure
  	 objOop's classIndex refers to the classObj's actual classIndex.
  	 Note that this is recursive, but the metaclass chain should terminate quickly."
  	<inline: false>
  	| classIndex classObj realClassIndex |
  	classIndex := manager classIndexOf: objOop.
  	classObj := manager classOrNilAtIndex: classIndex.
  	self assert: (coInterpreter objCouldBeClassObj: classObj).
  	realClassIndex := manager rawHashBitsOf: classObj.
+ 	realClassIndex = 16r2925 ifTrue: [self cCode: 'raise(SIGINT)'].
  	(classIndex ~= realClassIndex
  	 and: [classIndex > manager lastClassIndexPun]) ifTrue:
  		[manager setClassIndexOf: objOop to: realClassIndex].
  	(manager isWhite: classObj) ifTrue:
  		[self pushOnMarkingStackAndMakeGrey: classObj.
  		 self markAndTraceClassOf: classObj]!

Item was changed:
  ----- Method: SpurIncrementalMarker>>markAndTraceWeaklingsFrom: (in category 'weaklings and ephemerons') -----
  markAndTraceWeaklingsFrom: startIndex
  	"Mark weaklings on the weaklingStack, ignoring startIndex
  	 number of elements on the bottom of the stack.  Answer
  	 the size of the stack *before* the enumeration began."
  	^manager objStack: manager weaklingStack from: startIndex do:
  		[:weakling|
  		 self deny: (manager isForwarded: weakling).
  		self flag: #Todo. "this will probably just push a part on the marking stack. This will get resolved with the next mark loop"
  		 self markAndTraceClassOf: weakling.
+ 		(self allReferencedClassesAreMarkedOrGreyFrom: weakling lastObj: -1) ~= -1
+ 			ifFalse: [self cCode: 'raise(SIGINT)'].
  		
  		"N.B. generateToByDoLimitExpression:negative:on: guards against (unsigned)0 - 1 going +ve"
  		 0 to: (manager numStrongSlotsOfWeakling: weakling) - 1 do:
  			[:i| | field |
  			field := manager followOopField: i ofObject: weakling.
  			((manager isImmediate: field) or: [manager isMarked: field]) ifFalse:
  				[self incrementalMarkAndTrace: field]]]!

Item was changed:
  ----- Method: SpurIncrementalMarker>>markInactiveEphemerons (in category 'weaklings and ephemerons') -----
  markInactiveEphemerons
  	"Go through the unscanned ephemerons, marking the inactive ones, and
  	 removing them from the unscanned ephemerons. Answer if any inactive
  	 ones were found. We cannot fire the ephemerons until all are found to
  	 be active since scan-marking an inactive ephemeron later in the set may
  	 render a previously-observed active ephemeron as inactive."
  	| foundInactive ptr |
+ 	self break.
  	foundInactive := false.
  	ptr := manager unscannedEphemerons start.
  	[ptr < manager unscannedEphemerons top] whileTrue:
  		[| ephemeron key |
  		 key := manager followedKeyOfEphemeron: (ephemeron := manager longAt: ptr).
  		 ((manager isImmediate: key) or: [manager isMarked: key])
  			ifTrue:
  				[foundInactive := true.
  				 "Now remove the inactive ephemeron from the set, and scan-mark it.
  				  Scan-marking it may add more ephemerons to the set."
  				 manager unscannedEphemerons top: manager unscannedEphemerons top - manager bytesPerOop.
  				 manager unscannedEphemerons top > ptr ifTrue:
  					[manager longAt: ptr put: (manager longAt: manager unscannedEphemerons top)].
  				 self markAndTrace: ephemeron]
  			ifFalse:
  				[ptr := ptr + manager bytesPerOop]].
  	^foundInactive!

Item was removed:
- ----- Method: SpurIncrementalMarker>>markObjects: (in category 'as yet unclassified') -----
- markObjects: objectsShouldBeUnmarkedAndUnmarkedClassesShouldBeExpunged
- 
- 	"mark objects has to mark all available objects in on go (e.g. used in allInstances where you only want live objects so you do not keep objects that should
- 	be collected alive, as they get collected into an array that then holds a reference to the object)
- 	In the incremental case we need to throw away our marking progress until now :( (in incremental collection garbage can stay around until the next round
- 	of collection when we encountered an object during earlier stages of marking when an object is still alive, but it gets unreferenced during one of the 
- 	following mutator runs)"
- 	
- 	self flag: #Todo. "we made a (forced) complete marking. Lets use the results"
- 	self completeMarkObjects.
- 	self assert: (manager isEmptyObjStack: manager markStack).!

Item was added:
+ ----- Method: SpurIncrementalMarker>>markersMarkObjects: (in category 'as yet unclassified') -----
+ markersMarkObjects: objectsShouldBeUnmarkedAndUnmarkedClassesShouldBeExpunged
+ 
+ 	"mark objects has to mark all available objects in one go (e.g. used in allInstances where you only want live objects so you do not keep objects that should
+ 	be collected alive, as they get collected into an array that then holds a reference to the object)
+ 	In the incremental case we need to throw away our marking progress until now :( (in incremental collection garbage can stay around until the next round
+ 	of collection when we encountered an object during earlier stages of marking when an object is still alive, but it gets unreferenced during one of the 
+ 	following mutator runs)"
+ 	
+ 	self flag: #Todo. "we made a (forced) complete marking. Lets use the results"
+ 	self completeMarkObjects.
+ 	self assert: (manager isEmptyObjStack: manager markStack).!

Item was changed:
  ----- Method: SpurIncrementalMarker>>pushAllRootsOnMarkStack (in category 'root-scanning') -----
  pushAllRootsOnMarkStack
  	"Roots are:
  		1. references from the stack
  		2. references from the hidden roots
  		3. references from extra roots?
  		4. references from young space (it was recently scavenged -> only alive objects)"
  		
+ 	
+ 	self pushInternalStructuresOnMarkStack.
- 		
  	self pushStackReferencesOnMarkingStack.
- 	self pushHiddenRootsReferencesOnMarkingStack.
- 	self pushExtraRootsReferencesOnMarkingStack.
  	self pushNewSpaceReferencesOnMarkingStack.!

Item was changed:
  ----- Method: SpurIncrementalMarker>>pushHiddenRootsReferencesOnMarkingStack (in category 'root-scanning') -----
  pushHiddenRootsReferencesOnMarkingStack
  
+ 	| classTablePageSizeLocal |
- 	| classTablePageSize |
  	self markAndTraceObjStack: manager markStack andContents: false.
  	self markAndTraceObjStack: manager weaklingStack andContents: false.
  	self markAndTraceObjStack: manager mournQueue andContents: true.
  	self markAndTraceObjStack: manager ephemeronStack andContents: false.
  	
+ 	classTablePageSizeLocal := manager numStrongSlotsOfInephemeral: manager classTableFirstPage.
+ 	self markNSlots: classTablePageSizeLocal of: manager classTableFirstPage.
+ 	self blackenObject: manager classTableFirstPage.
+ 	
+ 	1 to: manager numClassTablePages - 1 do:
+ 		[:i| manager setIsMarkedOf: (manager fetchPointer: i ofObject: manager hiddenRootsObj)
+ 				to: true].!
- 	classTablePageSize := manager numStrongSlotsOfInephemeral: manager classTableFirstPage.
- 	self markNSlots: classTablePageSize of: manager classTableFirstPage.
- 	self blackenObject: manager classTableFirstPage!

Item was added:
+ ----- Method: SpurIncrementalMarker>>pushInternalStructuresOnMarkStack (in category 'root-scanning') -----
+ pushInternalStructuresOnMarkStack		
+ 		
+ 	self pushHiddenRootsReferencesOnMarkingStack.
+ 	self pushExtraRootsReferencesOnMarkingStack.!

Item was changed:
  ----- Method: SpurIncrementalMarker>>pushNewSpaceReferencesOnMarkingStack (in category 'root-scanning') -----
  pushNewSpaceReferencesOnMarkingStack
  
  	manager allNewSpaceObjectsDo: [:objOop | | format |
  		format := manager formatOf: objOop.
  		
+ 		self markAndTraceClassOf: objOop.
+ 		(self allReferencedClassesAreMarkedOrGreyFrom: objOop lastObj: -1) ~= -1
+ 			ifFalse: [self cCode: 'raise(SIGINT)'].
+ 		
  		"has the object pointers to visit?"
  		((manager isNonImmediate: objOop) and: [(manager isPureBitsFormat: format) not])
  			ifTrue: [ | slotNumber |
  				slotNumber := manager numStrongSlotsOfInephemeral: objOop.
  				
  				0 to: slotNumber - 1
  					do: [ :slotIndex | | slot |
  						slot := manager fetchPointer: slotIndex ofObject: objOop.
  							
  						(self shoudlBeOnMarkingStack: slot)
  							ifTrue: [self markAndShouldScan: slot]]]]
  				!

Item was changed:
  ----- Method: SpurIncrementalMarker>>pushOnMarkingStackAndMakeGreyIfNecessary: (in category 'marking-stack') -----
  pushOnMarkingStackAndMakeGreyIfNecessary: objOop
  
+ 	((manager isImmediate: objOop) and: [manager isWhite: objOop])
- 	(manager isImmediate: objOop)
- 		ifTrue: [^ self].
- 		
- 	(manager isWhite: objOop)
  		ifTrue: [self pushOnMarkingStackAndMakeGrey: objOop]!

Item was changed:
  ----- Method: SpurIncrementalMarker>>writeBarrierFor:at:with: (in category 'barrier') -----
  writeBarrierFor: anObject at: index with: value
  	"a dijkstra style write barrier with the addition of the generation check
  	objects that are not able to contain pointers are ignored too, as the write barries
  	should ensure we lose no references and this objects do not hold any of them"
  	<inline: true>
  	
  	self flag: #Todo. "we probably want the oldObject check to be the first one as it is only a pointer comparison and no dereferencing is needed"
+ 	
+ 	"((manager isImmediate: value) not and: [(manager isPureBitsNonImm: value)])
+ 		ifTrue: [coInterpreter cr; print: 'saw: '; printHexnp: value; tab; flush]."
+ 	
+ 	(self marking and: [(manager isImmediate: value) not and: [(manager isOldObject: anObject) and: [(manager isOldObject: value) and: [manager isMarked: anObject]]]])
+ 		ifTrue: [self markAndShouldScan: value]!
- 	(self marking and: [(self isLeafInObjectGraph: value) not  and: [(manager isOldObject: anObject) and: [(manager isOldObject: value) and: [manager isMarked: anObject]]]])
- 		ifTrue: [self pushOnMarkingStackAndMakeGreyIfNecessary: value]!
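
A usage sketch for the barrier above (hypothetical, not code from this commit): writeBarrierFor:at:with: is only the marker-side half of the barrier; the mutator has to invoke it on each pointer store while incremental marking is active. The sketch assumes a SpurMemoryManager-style store primitive with access to the manager's marker instance variable and its existing storePointerUnchecked:ofObject:withValue: raw store.

	storePointer: index ofObject: anObject withValue: value
		"Hypothetical sketch: run the incremental marker's Dijkstra-style
		 insertion barrier before the raw store, so a marked (already scanned)
		 old object cannot come to reference an unmarked old object unnoticed."
		marker writeBarrierFor: anObject at: index with: value.
		^self storePointerUnchecked: index ofObject: anObject withValue: value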

Item was changed:
  ----- Method: SpurIncrementalSweeper>>incrementalSweep (in category 'api - incremental') -----
  incrementalSweep
  	<inline: #never> "for profiling"
  	
  	self initIfNecessary.
  	
+ 	self assert: manager validObjectColors.
+ 	
  	self doIncrementalSweeping
  		ifTrue: [self finishSweeping.
  			^ true].
  		
  	^ false
  	!

Item was removed:
- ----- Method: SpurMarker>>markObjects: (in category 'as yet unclassified') -----
- markObjects: objectsShouldBeUnmarkedAndUnmarkedClassesShouldBeExpunged
- 
- 	self shouldBeImplemented!

Item was added:
+ ----- Method: SpurMarker>>markersMarkObjects: (in category 'as yet unclassified') -----
+ markersMarkObjects: objectsShouldBeUnmarkedAndUnmarkedClassesShouldBeExpunged
+ 
+ 	self shouldBeImplemented!

Item was changed:
  ----- Method: SpurMemoryManager>>addFreeChunkWithBytes:at: (in category 'free space') -----
  addFreeChunkWithBytes: bytes at: address
  
+ 	<var: 'bytes' type: #'usqInt'>
- 	<var: 'aCString' type: #'usqInt'>
  	totalFreeOldSpace := totalFreeOldSpace + bytes.
  	^self freeChunkWithBytes: bytes at: address!

Item was changed:
  ----- Method: SpurMemoryManager>>allInstancesOf: (in category 'primitive support') -----
  allInstancesOf: aClass
  	"Attempt to answer an array of all objects, excluding those that may
  	 be garbage collected as a side effect of allocating the result array.
  	 If no memory is available answer the number of instances as a SmallInteger.
  	 Since objects are at least 16 bytes big, and the largest SmallInteger covers
  	 1/4 of the address space, the count can never overflow."
  	| classIndex freeChunk ptr start limit count bytes |
  	classIndex := self rawHashBitsOf: aClass.
  	classIndex = 0 ifTrue:
  		[freeChunk := self allocateSlots: 0 format: self arrayFormat classIndex: ClassArrayCompactIndex.
  		 ^freeChunk].
  	gc markObjectsForEnumerationPrimitives ifTrue:
+ 		[gc markObjects: true]. "may not want to revive objects unnecessarily; but marking is sloooow."
- 		[marker markObjects: true]. "may not want to revive objects unnecessarily; but marking is sloooow."
  	freeChunk := self allocateLargestFreeChunk. "N.B. Does /not/ update totalFreeOldSpace"
  	start := freeChunk + self baseHeaderSize.
  	limit := self addressAfter: freeChunk.
  	(self isClassAtUniqueIndex: aClass)
  		ifTrue:
  			[self uniqueIndex: classIndex allInstancesInto: start limit: limit resultsInto: [:c :p| count := c. ptr := p]]
  		ifFalse:
  			[self ambiguousClass: aClass allInstancesInto: start limit: limit resultsInto: [:c :p| count := c. ptr := p]].
  	self assert: (self isEmptyObjStack: markStack).
  	gc markObjectsForEnumerationPrimitives
  		ifTrue:
  			[self assert: self allObjectsUnmarked.
  			 self emptyObjStack: weaklingStack]
  		ifFalse:
  			[self assert: (self isEmptyObjStack: weaklingStack)].
  	(count > (ptr - start / self bytesPerOop) "not enough room"
  	 or: [limit ~= ptr and: [limit - ptr <= self allocationUnit]]) ifTrue: "can't split a single word"
  		[self freeObject: freeChunk.
  		 ^self integerObjectOf: count].
  	count < self numSlotsMask ifTrue:
  		[| smallObj |
  		 smallObj := self allocateSlots: count format: self arrayFormat classIndex: ClassArrayCompactIndex.
  		 0 to: count - 1 do:
  			[:i|
  			self storePointerUnchecked: i ofObject: smallObj withValue: (self fetchPointer: i ofFreeChunk: freeChunk)].
  		 self freeChunkWithBytes: (self bytesInBody: freeChunk) at: (self startOfObject: freeChunk).
  		 self beRootIfOld: smallObj.
  		 self checkFreeSpace: GCModeFull.
  		 ^smallObj].
  	bytes := self largeObjectBytesForSlots: count.
  	start := self startOfObject: freeChunk.
  	self freeChunkWithBytes: limit - start - bytes at: start + bytes.
  	totalFreeOldSpace := totalFreeOldSpace - bytes.
  	self rawOverflowSlotsOf: freeChunk put: count.
  	self set: freeChunk classIndexTo: ClassArrayCompactIndex formatTo: self arrayFormat.
  	gc maybeModifyGCFlagsOf: freeChunk.
  	self possibleRootStoreInto: freeChunk.
  	self checkFreeSpace: GCModeFull.
  	self runLeakCheckerFor: GCModeFull.
  	^freeChunk!

Item was changed:
  ----- Method: SpurMemoryManager>>allObjects (in category 'primitive support') -----
  allObjects
  	"Attempt to answer an array of all objects, excluding those that may
  	 be garbage collected as a side effect of allocating the result array.
  	 If no memory is available answer the number of objects as a SmallInteger.
  	 Since objects are at least 16 bytes big, and the largest SmallInteger covers
  	 1/4 of the address space, the count can never overflow."
  	| freeChunk ptr start limit count bytes |
  	gc markObjectsForEnumerationPrimitives ifTrue:
+ 		[gc markObjects: true]. "may not want to revive objects unnecessarily; but marking is sloooow."
- 		[marker markObjects: true]. "may not want to revive objects unnecessarily; but marking is sloooow."
  	freeChunk := self allocateLargestFreeChunk. "N.B. Does /not/ update totalFreeOldSpace"
  	ptr := start := freeChunk + self baseHeaderSize.
  	limit := self addressAfter: freeChunk.
  	count := 0.
  	self allHeapEntitiesDo:
  		[:obj| "continue enumerating even if no room so as to unmark all objects."
  		 (gc markObjectsForEnumerationPrimitives
  				ifTrue: [self isMarked: obj]
  				ifFalse: [true]) ifTrue:
  			[(self isNormalObject: obj)
  				ifTrue:
  					[gc markObjectsForEnumerationPrimitives ifTrue:
  						[self setIsMarkedOf: obj to: false].
  					 count := count + 1.
  					 ptr < limit ifTrue:
  						[self longAt: ptr put: obj.
  						 ptr := ptr + self bytesPerOop]]
  				ifFalse:
  					[gc markObjectsForEnumerationPrimitives ifTrue:
  						[(self isSegmentBridge: obj) ifFalse:
  							[self setIsMarkedOf: obj to: false]]]]].
  	self assert: (self isEmptyObjStack: markStack).
  	gc markObjectsForEnumerationPrimitives
  		ifTrue:
  			[self assert: self allObjectsUnmarked.
  			 self emptyObjStack: weaklingStack]
  		ifFalse:
  			[self assert: (self isEmptyObjStack: weaklingStack)].
  	self assert: count >= self numSlotsMask.
  	(count > (ptr - start / self bytesPerOop) "not enough room"
  	 or: [limit ~= ptr and: [limit - ptr <= self allocationUnit]]) ifTrue: "can't split a single word"
  		[self freeChunkWithBytes: (self bytesInBody: freeChunk) at: (self startOfObject: freeChunk).
  		 self checkFreeSpace: GCModeFull.
  		 ^self integerObjectOf: count].
  	bytes := self largeObjectBytesForSlots: count.
  	start := self startOfObject: freeChunk.
  	self freeChunkWithBytes: limit - start - bytes at: start + bytes.
  	totalFreeOldSpace := totalFreeOldSpace - bytes.
  	self rawOverflowSlotsOf: freeChunk put: count.
  	self set: freeChunk classIndexTo: ClassArrayCompactIndex formatTo: self arrayFormat.
  	gc maybeModifyGCFlagsOf: freeChunk.
  	self possibleRootStoreInto: freeChunk.
  	self checkFreeSpace: GCModeFull.
  	self runLeakCheckerFor: GCModeFull.
  	^freeChunk!

Item was removed:
- ----- Method: SpurMemoryManager>>allStrongSlotsOfWeaklingAreMarked: (in category 'weakness and ephemerality') -----
- allStrongSlotsOfWeaklingAreMarked: aWeakling
- 	"N.B. generateToByDoLimitExpression:negative:on: guards against (unsigned)0 - 1 going +ve"
- 	0 to: (self numStrongSlotsOfWeakling: aWeakling) - 1 do:
- 		[:i| | referent |
- 		referent := self fetchPointer: i ofObject: aWeakling.
- 		(self isNonImmediate: referent) ifTrue:
- 			[(self isMarked: referent) ifFalse:
- 				[^false]]].
- 	^true!

Item was added:
+ ----- Method: SpurMemoryManager>>allStrongSlotsOfWeaklingAreMarked:excludingYoungObjects: (in category 'weakness and ephemerality') -----
+ allStrongSlotsOfWeaklingAreMarked: aWeakling excludingYoungObjects: aBoolean
+ 	"N.B. generateToByDoLimitExpression:negative:on: guards against (unsigned)0 - 1 going +ve"
+ 	0 to: (self numStrongSlotsOfWeakling: aWeakling) - 1 do:
+ 		[:i| | referent |
+ 		referent := self fetchPointer: i ofObject: aWeakling.
+ 		(self isNonImmediate: referent) ifTrue:
+ 			[(self isMarked: referent) ifFalse:
+ 				[((self isYoung: referent) and: [aBoolean] )
+ 					ifFalse: [self cCode: 'raise(SIGINT);'. ^false]  ]]].
+ 	^true!

Item was changed:
  ----- Method: SpurMemoryManager>>checkHeapFreeSpaceIntegrity (in category 'debug support') -----
  checkHeapFreeSpaceIntegrity
  	"Perform an integrity/leak check using the heapMap.  Assume clearLeakMapAndMapAccessibleFreeSpace
  	 has set a bit at each free chunk's header.  Scan all objects in the heap checking that no pointer points
  	 to a free chunk and that all free chunks that refer to others refer to marked chunks.  Answer if all checks pass."
  	| ok total |
  	<inline: false>
  	<var: 'total' type: #usqInt>
  	ok := true.
  	total := 0.
  	0 to: self numFreeLists - 1 do:
  		[:i|
  		(freeLists at: i) ~= 0 ifTrue:
  			[(heapMap heapMapAtWord: (self pointerForOop: (freeLists at: i))) = 0 ifTrue:
  				[coInterpreter print: 'leak in free list '; printNum: i; print: ' to non-free '; printHex: (freeLists at: i); eekcr.
  				 ok := false]]].
  
  	"Excuse the duplication but performance is at a premium and we avoid
  	 some tests by splitting the newSpace and oldSpace enumerations."
  	self allNewSpaceEntitiesDo:
  		[:obj| | fieldOop |
  		 (self isFreeObject: obj)
  			ifTrue:
  				[coInterpreter print: 'young object '; printHex: obj; print: ' is free'; eekcr.
- 				coInterpreter longPrintOop: obj.
  				 ok := false]
  			ifFalse:
  				[obj ~= freeSpaceCheckOopToIgnore ifTrue:
  					[0 to: (self numPointerSlotsOf: obj) - 1 do:
  						[:fi|
  						 fieldOop := self fetchPointer: fi ofObject: obj.
  						 (self isNonImmediate: fieldOop) ifTrue:
  							[(heapMap heapMapAtWord: (self pointerForOop: fieldOop)) ~= 0 ifTrue:
  								[coInterpreter print: 'object leak in '; printHex: obj; print: ' @ '; printNum: fi; print: ' = '; printHex: fieldOop; print: ' is free'; eekcr.
- 								coInterpreter longPrintOop: obj.
  								 ok := false]]]]]].
  	self allOldSpaceEntitiesDo:
  		[:obj| | fieldOop |
  		(self isFreeObject: obj)
  			ifTrue:
  				[
  				(compactor compactor segmentToFill isNil or: [(self objectStartingAt: (compactor compactor segmentToFill segStart)) ~= obj])
  					ifTrue: [
  						(heapMap heapMapAtWord: (self pointerForOop: obj)) = 0 ifTrue:
  						[coInterpreter print: 'leak in free chunk '; printHex: obj; print: ' is unmapped?!! '; eekcr.
- 						coInterpreter longPrintOop: obj.
  						 ok := false].
  					 fieldOop := self fetchPointer: self freeChunkNextIndex ofFreeChunk: obj.
  					 (fieldOop ~= 0
  					 and: [(heapMap heapMapAtWord: (self pointerForOop: fieldOop)) = 0]) ifTrue:
  						[coInterpreter print: 'leak in free chunk '; printHex: obj; print: ' @ 0 = '; printHex: fieldOop; print: ' is unmapped'; eekcr.
- 						coInterpreter longPrintOop: obj.
  						 ok := false].
  					(self isLilliputianSize: (self bytesInBody: obj)) ifFalse:
  						[fieldOop := self fetchPointer: self freeChunkPrevIndex ofFreeChunk: obj.
  						 (fieldOop ~= 0
  						 and: [(heapMap heapMapAtWord: (self pointerForOop: fieldOop)) = 0]) ifTrue:
  							[coInterpreter print: 'leak in free chunk '; printHex: obj; print: ' @ 0 = '; printHex: fieldOop; print: ' is unmapped'; eekcr.
- 							coInterpreter longPrintOop: obj.
  							 ok := false]].
  					(self isLargeFreeObject: obj) ifTrue:
  						[self freeChunkParentIndex to: self freeChunkLargerIndex do:
  							[:fi|
  							 fieldOop := self fetchPointer: fi ofFreeChunk: obj.
  							 (fieldOop ~= 0
  							 and: [(heapMap heapMapAtWord: (self pointerForOop: fieldOop)) = 0]) ifTrue:
  								[coInterpreter print: 'leak in free chunk '; printHex: obj; print: ' @ '; printNum: fi; print: ' = '; printHex: fieldOop; print: ' is unmapped'; eekcr.
- 								coInterpreter longPrintOop: obj.
  								 ok := false]]].
  					total := total + (self bytesInBody: obj)]]
  				
  			ifFalse:
  				[obj ~= freeSpaceCheckOopToIgnore ifTrue:
  					[0 to: (self numPointerSlotsOf: obj) - 1 do:
  						[:fi|
  						 (self isForwarded: obj)
  							ifTrue: 
  								[self assert: fi = 0. "I'm now trying to use forwarders in GC algorithms..."
  								 fieldOop := self fetchPointer: fi ofMaybeForwardedObject: obj] 
  							ifFalse: "We keep #fetchPointer:ofObject: API here for assertions"
  								[fieldOop := self fetchPointer: fi ofObject: obj].
  						 (self isNonImmediate: fieldOop) ifTrue:
  							[(heapMap heapMapAtWord: (self pointerForOop: fieldOop)) ~= 0 ifTrue:
  								[coInterpreter print: 'object leak in '; printHex: obj; print: ' @ '; printNum: fi; print: ' = '; printHex: fieldOop; print: ' is free'; eekcr.
- 								coInterpreter longPrintOop: obj.
  								 ok := false]]]]]].
  		
  	total - totalFreeOldSpace ~= 0 ifTrue:
  		[coInterpreter print: 'incorrect totalFreeOldSpace; expected '; printNum: totalFreeOldSpace; print: ' found '; printNum: total; eekcr.
  		 ok := false].
  	^ok!

Item was changed:
  ----- Method: SpurMemoryManager>>checkHeapIntegrity:classIndicesShouldBeValid: (in category 'debug support') -----
(excessive size, no diff calculated)

Item was changed:
+ ----- Method: SpurMemoryManager>>clearLeakMapAndMapAccessibleFreeSpace (in category 'debug support - leak/mark map') -----
- ----- Method: SpurMemoryManager>>clearLeakMapAndMapAccessibleFreeSpace (in category 'debug support') -----
  clearLeakMapAndMapAccessibleFreeSpace
  	"Perform an integrity/leak check using the heapMap.  Set a bit at each free chunk's header."
  	<inline: false>
  	heapMap clearHeapMap.
  	self allOldSpaceEntitiesFrom: self firstObject
  		do: [:objOop|
  			(self isFreeObject: objOop) ifTrue:
  				[heapMap heapMapAtWord: (self pointerForOop: objOop) Put: 1]]!

Item was changed:
+ ----- Method: SpurMemoryManager>>clearLeakMapAndMapAccessibleObjects (in category 'debug support - leak/mark map') -----
- ----- Method: SpurMemoryManager>>clearLeakMapAndMapAccessibleObjects (in category 'debug support') -----
  clearLeakMapAndMapAccessibleObjects
  	"Perform an integrity/leak check using the heapMap.  Set a bit at each object's header."
  	<inline: false>
  	heapMap clearHeapMap.
  	self allObjectsDo:
  		[:oop| heapMap heapMapAtWord: (self pointerForOop: oop) Put: 1]!

Item was changed:
  ----- Method: SpurMemoryManager>>debugGCCollect (in category 'debug support') -----
  debugGCCollect
  	"empties the heap for writing "
  
  	coInterpreter preGCAction: GCModeFull.
  	self flushNewSpace.
+ 	gc markObjects: true.
- 	marker markObjects: true.
  	scavenger forgetUnmarkedRememberedObjects.
  	segmentManager prepareForGlobalSweep.
  	compactor compact.
  	coInterpreter postGCAction: GCModeFull.
  	
  	self assert: self validObjStacks.
  	self assert: (self isEmptyObjStack: markStack).
  	self assert: (self isEmptyObjStack: weaklingStack).
  	self assert: self allObjectsUnmarked.!

Item was added:
+ ----- Method: SpurMemoryManager>>expungeDuplicateAndUnmarkedClasses:ignoringClassesInYoungSpace: (in category 'class table') -----
+ expungeDuplicateAndUnmarkedClasses: expungeUnmarked ignoringClassesInYoungSpace: ignoreYounglings
+ 	"Bits have been set in the classTableBitmap corresponding to
+ 	 used classes.  Any class in the class table that does not have a
+ 	 bit set has no instances with that class index.  However, becomeForward:
+ 	 can create duplicate entries, and these duplicate entries won't match their
+ 	 identityHash. So expunge duplicates by eliminating unmarked entries that
+ 	 don't occur at their identityHash."
+ 	1 to: numClassTablePages - 1 do: "Avoid expunging the puns by not scanning the 0th page."
+ 		[:i| | classTablePage |
+ 		classTablePage := self fetchPointer: i ofObject: hiddenRootsObj.
+ 		 0 to: self classTablePageSize - 1 do:
+ 			[:j| | classOrNil classIndex |
+ 			 classOrNil := self fetchPointer: j ofObject: classTablePage.
+ 			 classIndex := i << self classTableMajorIndexShift + j.
+ 			 self assert: (classOrNil = nilObj or: [coInterpreter addressCouldBeClassObj: classOrNil]).
+ 			 "only remove a class if it is at a duplicate entry or it is unmarked and we're expunging unmarked classes."
+ 			 classOrNil = nilObj
+ 				ifTrue:
+ 					[classIndex < classTableIndex ifTrue:
+ 						[classTableIndex := classIndex]]
+ 				ifFalse:
+ 					[(
+ 					(ignoreYounglings and: [self isYoung: classOrNil]) not
+ 					and: [(expungeUnmarked and: [(self isMarked: classOrNil) not])
+ 					   or: [(self rawHashBitsOf: classOrNil) ~= classIndex]]) ifTrue:
+ 						[self storePointerUnchecked: j
+ 							ofObject: classTablePage
+ 							withValue: nilObj.
+ 						 "but if it is marked, it should still be in the table at its correct index."
+ 						 self assert: ((expungeUnmarked and: [(self isMarked: classOrNil) not])
+ 									or: [(self classAtIndex: (self rawHashBitsOf: classOrNil)) = classOrNil]).
+ 						 "If the removed class is before the classTableIndex, set the
+ 						  classTableIndex to point to the empty slot so as to reuse it asap."
+ 						 classIndex < classTableIndex ifTrue:
+ 							[classTableIndex := classIndex]]]]].
+ 	"classTableIndex must never index the first page, which is reserved for classes known to the VM."
+ 	self assert: classTableIndex >= (1 << self classTableMajorIndexShift)!

Item was added:
+ ----- Method: SpurMemoryManager>>firstReferenceTo: (in category 'debug printing') -----
+ firstReferenceTo: anOop
+ 	"Scan the heap and answer the first object found that refers to anOop"
+ 	<export: true> "useful for VM debugging; use export: not api, so it will be accessible on win32 and won't be written to cointerp.h"
+ 	self allObjectsDo:
+ 		[:obj| | i |
+ 		 i := self numPointerSlotsOf: obj.
+ 		 [(i := i - 1) >= 0] whileTrue:
+ 			[anOop = (self fetchPointer: i ofMaybeForwardedObject: obj) ifTrue:
+ 				[^ obj]]]!
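
Unlike printReferencesTo:, firstReferenceTo: stops at the first hit and answers the referring object, so it composes with the other debug printers. A hypothetical simulator-side probe, where sim and suspectOop are placeholders:

	sim longPrintOop: (sim objectMemory firstReferenceTo: suspectOop).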

Item was changed:
  ----- Method: SpurMemoryManager>>initialize (in category 'initialization') -----
  initialize
  	"We can put all initializations that set something to 0 or to false here.
  	 In C all global variables are initialized to 0, and 0 is false."
  	| moreThanEnough |
  	remapBuffer := Array new: RemapBufferSize.
  	remapBufferCount := extraRootCount := 0. "see below"
  	freeListsMask := totalFreeOldSpace := lowSpaceThreshold := 0.
  	checkForLeaks := 0.
  	needGCFlag := signalLowSpace := false.
  	becomeEffectsFlags := gcPhaseInProgress := validatedIntegerClassFlags := 0.
  	statScavenges := statIncrGCs := statFullGCs := 0.
  	statMaxAllocSegmentTime := 0.
  	statMarkUsecs := statSweepUsecs := statScavengeGCUsecs := statIncrGCUsecs := statFullGCUsecs := statCompactionUsecs := statGCEndUsecs := gcSweepEndUsecs := 0.
  	statSGCDeltaUsecs := statIGCDeltaUsecs := statFGCDeltaUsecs := 0.
  	statGrowMemory := statShrinkMemory := statRootTableCount := statAllocatedBytes := 0.
  	statRootTableOverflows := statMarkCount := statCompactPassCount := statCoalesces := 0.
  
  	"We can initialize things that are allocated but are lazily initialized."
  	unscannedEphemerons := SpurContiguousObjStack new.
  
  	"we can initialize things that are virtual in C."
  	scavenger := SpurGenerationScavenger simulatorClass new manager: self; yourself.
  	segmentManager := SpurSegmentManager simulatorClass new manager: self; yourself.
  	compactor := self class compactorClass simulatorClass new manager: self; yourself.
  	marker := self class markerClass simulatorClass new manager: self; yourself.
  	gc := self class gcClass simulatorClass new manager: self; marker: marker; compactor: compactor; scavenger: scavenger; yourself.
  
  	"We can also initialize here anything that is only for simulation."
  	heapMap := CogCheck32BitHeapMap new.
  
  	"N.B. We *don't* initialize extraRoots because we don't simulate it."
  
  	"This is needed on 64-bits. We don't want a simulation creating a huge heap by default.
  	 By default use 512Mb on 64-bits, 256Mb on 32-bits."
  	moreThanEnough := 1024 * 1024 * 1024 / (16 / self wordSize). "One million dollars, ha ha ha ha ha,... ha, ha ha ha ha, ..."
  	maxOldSpaceSize := self class initializationOptions
  							ifNotNil: [:initOpts| initOpts at: #maxOldSpaceSize ifAbsent: [moreThanEnough]]
+ 							ifNil: [moreThanEnough].!
- 							ifNil: [moreThanEnough]!
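
The only change to initialize in this diff is the added statement terminator; the moreThanEnough default is untouched and works out as its comment claims. Checking the arithmetic for both word sizes:

	(1024 * 1024 * 1024 / (16 / 8)) = (512 * 1024 * 1024).	"64-bit word size: 512 MiB -> true"
	(1024 * 1024 * 1024 / (16 / 4)) = (256 * 1024 * 1024).	"32-bit word size: 256 MiB -> true"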

Item was removed:
- ----- Method: SpurMemoryManager>>nilUnmarkedWeaklingSlots (in category 'weakness and ephemerality') -----
- nilUnmarkedWeaklingSlots
- 	"Nil the unmarked slots in the weaklings on the
- 	 weakling stack, finalizing those that lost references.
- 	 Finally, empty the weaklingStack."
- 	<inline: #never> "for profiling"
- 	self cCode: '' inSmalltalk: [coInterpreter transcript nextPutAll: 'nilling...'; flush].
- 	self eassert: [self allOldMarkedWeakObjectsOnWeaklingStack].
- 	weaklingStack = nilObj ifTrue:
- 		[^self].
- 	self objStack: weaklingStack from: 0 do:
- 		[:weakling| | anyUnmarked |
- 		anyUnmarked := self nilUnmarkedWeaklingSlotsIn: weakling.
- 		anyUnmarked ifTrue:
- 			["fireFinalization: could grow the mournQueue and if so,
- 			  additional pages must be marked to avoid being GC'ed."
- 			 self assert: marker marking.
- 			 coInterpreter fireFinalization: weakling]].
- 	self emptyObjStack: weaklingStack!

Item was added:
+ ----- Method: SpurMemoryManager>>nilUnmarkedWeaklingSlotsExcludingYoungObjects: (in category 'weakness and ephemerality') -----
+ nilUnmarkedWeaklingSlotsExcludingYoungObjects: aBoolean
+ 	"Nil the unmarked slots in the weaklings on the
+ 	 weakling stack, finalizing those that lost references.
+ 	 Finally, empty the weaklingStack."
+ 	<inline: #never> "for profiling"
+ 	self cCode: '' inSmalltalk: [coInterpreter transcript nextPutAll: 'nilling...'; flush].
+ 	self eassert: [self allOldMarkedWeakObjectsOnWeaklingStack].
+ 	weaklingStack = nilObj ifTrue:
+ 		[^self].
+ 	self objStack: weaklingStack from: 0 do:
+ 		[:weakling| | anyUnmarked |
+ 		anyUnmarked := self nilUnmarkedWeaklingSlotsIn: weakling excludingYoungObjects: aBoolean.
+ 		anyUnmarked ifTrue:
+ 			["fireFinalization: could grow the mournQueue and if so,
+ 			  additional pages must be marked to avoid being GC'ed."
+ 			 self assert: marker marking.
+ 			 coInterpreter fireFinalization: weakling]].
+ 	self emptyObjStack: weaklingStack!

Item was removed:
- ----- Method: SpurMemoryManager>>nilUnmarkedWeaklingSlotsIn: (in category 'weakness and ephemerality') -----
- nilUnmarkedWeaklingSlotsIn: aWeakling
- 	"Nil the unmarked slots in aWeakling and
- 	 answer if any unmarked slots were found."
- 	<inline: true>
- 	| anyUnmarked |
- 	anyUnmarked := false.
- 	self assert: (self allStrongSlotsOfWeaklingAreMarked: aWeakling).
- 	"N.B. generateToByDoLimitExpression:negative:on: guards against (unsigned)0 - 1 going +ve"
- 	(self numStrongSlotsOfWeakling: aWeakling) to: (self numSlotsOf: aWeakling) - 1 do:
- 		[:i| | referent |
- 		referent := self fetchPointer: i ofObject: aWeakling.
- 		(self isNonImmediate: referent) ifTrue:
- 			[(self isUnambiguouslyForwarder: referent) ifTrue:
- 				[referent := self fixFollowedField: i ofObject: aWeakling withInitialValue: referent].
- 			 ((self isImmediate: referent) or: [self isMarked: referent]) ifFalse:
- 				[self storePointerUnchecked: i ofObject: aWeakling withValue: nilObj.
- 				 anyUnmarked := true]]].
- 	^anyUnmarked!

Item was added:
+ ----- Method: SpurMemoryManager>>nilUnmarkedWeaklingSlotsIn:excludingYoungObjects: (in category 'weakness and ephemerality') -----
+ nilUnmarkedWeaklingSlotsIn: aWeakling excludingYoungObjects: aBoolean
+ 	"Nil the unmarked slots in aWeakling and
+ 	 answer if any unmarked slots were found."
+ 	<inline: true>
+ 	| anyUnmarked |
+ 	anyUnmarked := false.
+ 	self assert: (self allStrongSlotsOfWeaklingAreMarked: aWeakling excludingYoungObjects: aBoolean).
+ 	"N.B. generateToByDoLimitExpression:negative:on: guards against (unsigned)0 - 1 going +ve"
+ 	(self numStrongSlotsOfWeakling: aWeakling) to: (self numSlotsOf: aWeakling) - 1 do:
+ 		[:i| | referent |
+ 		referent := self fetchPointer: i ofObject: aWeakling.
+ 		(self isNonImmediate: referent) ifTrue:
+ 			[(self isUnambiguouslyForwarder: referent) ifTrue:
+ 				[referent := self fixFollowedField: i ofObject: aWeakling withInitialValue: referent].
+ 			 ((self isImmediate: referent) or: [self isMarked: referent]) ifFalse:
+ 				[((self isYoung: referent) and: [aBoolean])
+ 					ifFalse: [self storePointerUnchecked: i ofObject: aWeakling withValue: nilObj.
+ 				 			 anyUnmarked := true]]]].
+ 	^anyUnmarked!
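
Only the weak (indexable) part of the object is scanned; the named, strong slots counted by numStrongSlotsOfWeakling: are never nilled, and with the new excludingYoungObjects: flag set, referents still in new space are also left alone for now. The image-level behaviour being implemented is the familiar one; an illustrative snippet, assuming a plain Squeak image:

	| w |
	w := WeakArray new: 3.
	w at: 1 put: Object new.
	Smalltalk garbageCollect.
	self assert: (w at: 1) isNil.	"the weak slot was nilled once its referent became unreachable"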

Item was changed:
  ----- Method: SpurMemoryManager>>objectsReachableFromRoots: (in category 'image segment in/out') -----
  objectsReachableFromRoots: arrayOfRoots
  	"This is part of storeImageSegmentInto:outPointers:roots:.
  	 Answer an Array of all the objects only reachable from the argument, an Array of root objects,
  	 starting with arrayOfRoots.  If there is no space, answer a SmallInteger whose value is the
  	 number of slots required.  This is used to collect the objects to include in an image segment
  	 on Spur, separate from creating the segment, hence simplifying the implementation.
  	 Thanks to Igor Stasenko for this idea."
  
  	| freeChunk ptr start limit count oop objOop |
  	<var: #freeChunk type: #usqInt> "& hence start & ptr are too; limit is also because of addressAfter:"
  	<inline: #never>
  	self assert: (self isArray: arrayOfRoots).
  	"Mark all objects except those only reachable from the arrayOfRoots by marking
  	 each object in arrayOfRoots and then marking all reachable objects (from the
  	 system roots).  This leaves unmarked only objects reachable from the arrayOfRoots.
  	 N.B. A side-effect of the marking is that all forwarders in arrayOfRoots will be followed."
   	self assert: self allObjectsUnmarked.
  	self markObjectsIn: arrayOfRoots.
+ 	gc markObjects: false.
- 	marker markObjects: false.
  
  	"After the mark phase all unreachable weak slots will have been nilled
  	 and all active ephemerons fired."
  	self assert: (self isEmptyObjStack: markStack).
  	self assert: (self isEmptyObjStack: weaklingStack).
  	self assert: self noUnscannedEphemerons.
  
  	"Now unmark the roots before collecting the transitive closure of unmarked objects accessible from the roots."
  	self unmarkObjectsIn: arrayOfRoots.
  
  	"Use the largest free chunk to answer the result."
  	freeChunk := self allocateLargestFreeChunk. "N.B. Does /not/ update totalFreeOldSpace"
  	totalFreeOldSpace := totalFreeOldSpace - (self bytesInBody: freeChunk). "but must update so that growth in the markStack does not cause assert fails."
  	ptr := start := freeChunk + self baseHeaderSize.
  	limit := self addressAfter: freeChunk.
  	count := 0.
  
  	"First put the arrayOfRoots; order is important."
  	self noCheckPush: arrayOfRoots onObjStack: markStack.
  
  	"Now collect the roots and the transitive closure of unmarked objects from them."
  	[self isEmptyObjStack: markStack] whileFalse:
  		[objOop := self popObjStack: markStack.
  		 self assert: (self isMarked: objOop).
  		 count := count + 1.
  		 ptr < limit ifTrue:
  			[self longAt: ptr put: objOop.
  			 ptr := ptr + self bytesPerOop].
  		 oop := self fetchClassOfNonImm: objOop.
  		 (self isMarked: oop) ifFalse:
  			[self setIsMarkedOf: oop to: true.
  			 self noCheckPush: oop onObjStack: markStack].
  		 ((self isContextNonImm: objOop)
  		  and: [coInterpreter isStillMarriedContext: objOop]) "widow now, before the copy loop"
  			ifTrue:
  				[0 to: (coInterpreter numSlotsOfMarriedContext: objOop) - 1 do:
  					[:i|
  					 oop := coInterpreter fetchPointer: i ofMarriedContext: objOop.
  					 ((self isImmediate: oop)
  					  or: [self isMarked: oop]) ifFalse:
  						[self setIsMarkedOf: oop to: true.
  						 self noCheckPush: oop onObjStack: markStack]]]
  			ifFalse:
  				[0 to: (self numPointerSlotsOf: objOop) - 1 do:
  					[:i|
  					 oop := self fetchPointer: i ofObject: objOop.
  					 ((self isImmediate: oop)
  					  or: [self isMarked: oop]) ifFalse:
  						[self setIsMarkedOf: oop to: true.
  						 self noCheckPush: oop onObjStack: markStack]]]].
  
  	self unmarkAllObjects.
  
  	"Now try and allocate the result"
  	(count > (ptr - start / self bytesPerOop) "not enough room"
  	 or: [limit ~= ptr and: [limit - ptr <= self allocationUnit]]) ifTrue: "can't split a single word"
  		[self freeObject: freeChunk.
  		 self checkFreeSpace: GCCheckImageSegment.
  		 ^self integerObjectOf: count].
  	"There's room; set the format, & classIndex and shorten."
  	self setFormatOf: freeChunk to: self arrayFormat.
  	self setClassIndexOf: freeChunk to: ClassArrayCompactIndex.
  	gc maybeModifyGCFlagsOf: freeChunk.
  	self shorten: freeChunk toIndexableSize: count.
  	(self isForwarded: freeChunk) ifTrue:
  		[freeChunk := self followForwarded: freeChunk].
  	self possibleRootStoreInto: freeChunk.
  	self checkFreeSpace: GCCheckImageSegment.
  	self runLeakCheckerFor: GCCheckImageSegment.
  	^freeChunk!
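
A sketch of how a caller interprets the two possible kinds of result; manager and rootsArray are placeholders for the memory manager and the oop of an Array of root oops:

	| result |
	result := manager objectsReachableFromRoots: rootsArray.
	(manager isIntegerObject: result)
		ifTrue: ["not enough free space; (manager integerValueOf: result) slots would be needed"]
		ifFalse: ["an Array oop whose first element is rootsArray, followed by the reachable closure"]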

Item was added:
+ ----- Method: SpurMemoryManager>>printRelativePositionOf: (in category 'debug printing') -----
+ printRelativePositionOf: obj
+ 
+ 	coInterpreter cr; print: 'Object is in segment with index: '; printNum: (segmentManager segmentIndexContainingObj: obj); tab; cr; flush.
+ 	coInterpreter cr; print: 'Its relative position in the segment is: '; printHex: obj - (segmentManager segmentContainingObj: obj) segStart; tab; cr; flush.!

Item was added:
+ ----- Method: SpurMemoryManager>>validObjectColors (in category 'debug support') -----
+ validObjectColors
+ 
+ 	| currentSweepingEntityT |
+ 	currentSweepingEntityT := gc compactor sweeper currentSweepingEntity ifNil: [self firstObject].
+ 
+ 	self allOldSpaceEntitiesFrom: currentSweepingEntityT do: [:obj |
+ 		((self isMarked: obj) and: [(self isPointers: obj) and: [(self isContext: obj) not]])
+ 			ifTrue: [| slotCount |
+ 				slotCount := self numSlotsOf: obj.
+ 				
+ 				0 to: slotCount - 1
+ 					do: [:index | | slot |
+ 						slot := self fetchPointer: index ofObject: obj.
+ 						
+ 						((self isNonImmediate: slot) and: [(self isOldObject: slot) and: [(self isForwarded: slot) not]])
+ 							ifTrue: [(slot >= currentSweepingEntityT and: [(self isMarked: slot) not])
+ 										ifTrue: [self halt.
+ 											coInterpreter longPrintOop: (self firstReferenceTo: (self firstReferenceTo: obj)).
+ 											self printReferencesTo: (self firstReferenceTo: obj).
+ 											self printReferencesTo: obj.
+ 											
+ 											self printRelativePositionOf: obj.		
+ 											self printRelativePositionOf: slot.											
+ 											
+ 											coInterpreter longPrintOop: obj.
+ 											coInterpreter longPrintOop: slot.
+ 											
+ 											^ false]]]]].
+ 	^ true!

Item was added:
+ ----- Method: SpurMemoryManager>>validObjectColorsRelaxed (in category 'debug support') -----
+ validObjectColorsRelaxed
+ 
+ 	| currentSweepingEntityT |
+ 	currentSweepingEntityT := gc compactor sweeper currentSweepingEntity ifNil: [self firstObject].
+ 
+ 	self allOldSpaceObjectsFrom: currentSweepingEntityT do: [:obj |
+ 		((self isMarked: obj) and: [(self isPointers: obj) and: [(self classIndexOf: obj) > self lastClassIndexPun and: [(self isWeak: obj) not]]])
+ 			ifTrue: [| slotCount |
+ 				slotCount := self numStrongSlotsOfInephemeral: obj.
+ 				
+ 				0 to: slotCount - 1
+ 					do: [:index | | slot |
+ 						slot := self fetchPointer: index ofObject: obj.
+ 						
+ 						((self isNonImmediate: slot) and: [(self isOldObject: slot) and: [(self isForwarded: slot) not]])
+ 							ifTrue: [(slot >= currentSweepingEntityT and: [self isWhite: slot])
+ 										ifTrue: [self halt.
+ 											coInterpreter longPrintOop: obj.
+ 											coInterpreter longPrintOop: slot.
+ 											
+ 											^ false]]]]].
+ 	^ true!
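
Both checks answer a Boolean and halt with diagnostics when they find a marked old object, at or after the sweeper's current position, that still references an unmarked/white old object the sweeper has not yet reached; the relaxed variant skips weak objects and looks only at strong slots. So they drop straight into assertions while debugging the incremental collector; illustrative use, with manager standing for the SpurMemoryManager instance:

	self assert: manager validObjectColors.
	self assert: manager validObjectColorsRelaxed.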

Item was added:
+ ----- Method: SpurSegmentManager>>segmentIndexContainingObj: (in category 'accessing') -----
+ segmentIndexContainingObj: objOop
+ 	<export: true>
+ 	numSegments - 1 to: 0 by: -1 do:
+ 		[:i|
+ 		objOop >= (segments at: i) segStart ifTrue:
+ 			[^i]].
+ 	^-1!

Item was changed:
  ----- Method: SpurStopTheWorldGarbageCollector>>globalGarbageCollect (in category 'as yet unclassified') -----
  globalGarbageCollect
  	<inline: true> "inline into fullGC"
  	
  	manager preGlobalGCActions.
  	
  	self assert: self validObjStacks.
  	self assert: (self isEmptyObjStack: manager markStack).
  	self assert: (self isEmptyObjStack: manager weaklingStack).
  
  	"Mark objects /before/ scavenging, to empty the rememberedTable of unmarked roots."
+ 	self markObjects: true.
- 	marker markObjects: true.
  	manager gcMarkEndUsecs: coInterpreter ioUTCMicrosecondsNow.
  	
  	scavenger forgetUnmarkedRememberedObjects.
  
  	coInterpreter setGCMode: GCModeNewSpace.
  	self doScavenge: MarkOnTenure.
  	coInterpreter setGCMode: GCModeFull.
  
  	"Mid-way the leak check must be more lenient.  Unmarked classes will have been
  	 expunged from the table, but unmarked instances will not yet have been reclaimed."
  	manager runLeakCheckerFor: GCModeFull
  		excludeUnmarkedObjs: true
  		classIndicesShouldBeValid: true.
  
  	manager compactionStartUsecs: coInterpreter ioUTCMicrosecondsNow.
  	manager segmentManager prepareForGlobalSweep. "for notePinned:"
  	compactor compact.
  	manager attemptToShrink.
  	manager setHeapSizeAtPreviousGC.
  
  	self assert: manager validObjStacks.
  	self assert: (manager isEmptyObjStack: manager markStack).
  	self assert: (manager isEmptyObjStack: manager weaklingStack).
  	self assert: manager allObjectsUnmarked.
  	manager runLeakCheckerFor: GCModeFull!

Item was changed:
  ----- Method: StackInterpreter class>>declareCVarsIn: (in category 'translation') -----
  declareCVarsIn: aCCodeGenerator
  	| vmClass |
  	self class == thisContext methodClass ifFalse: [^self]. "Don't duplicate decls in subclasses"
  	vmClass := aCCodeGenerator vmClass. "Generate primitiveTable etc based on vmClass, not just StackInterpreter"
  	aCCodeGenerator
  		addHeaderFile: '<stdio.h> /* for printf */';
  		addHeaderFile: '<stdlib.h> /* for e.g. alloca */';
  		addHeaderFile: '<setjmp.h>';
  		addHeaderFile: '<wchar.h> /* for wint_t */';
+ 		addHeaderFile: '<signal.h>';
  		addHeaderFile: '"vmCallback.h"';
  		addHeaderFile: '"sqMemoryFence.h"';
  		addHeaderFile: '"sqImageFileAccess.h"';
  		addHeaderFile: '"sqSetjmpShim.h"';
  		addHeaderFile: '"dispdbg.h"'.
  	LowcodeVM ifTrue:
  		[aCCodeGenerator addHeaderFile: '"sqLowcodeFFI.h"'].
  
  	vmClass declareInterpreterVersionIn: aCCodeGenerator defaultName: 'Stack'.
  	aCCodeGenerator
  		var: #interpreterProxy  type: #'struct VirtualMachine*'.
  	aCCodeGenerator
  		declareVar: #sendTrace type: 'volatile int';
  		declareVar: #byteCount type: #usqLong. "see dispdbg.h"
  	"These need to be pointers or unsigned."
  	self declareC: #(instructionPointer method newMethod)
  		as: #usqInt
  		in: aCCodeGenerator.
  	"These are all pointers; char * because Slang has no support for C pointer arithmetic."
  	self declareC: #(localIP localSP localFP stackPointer framePointer stackLimit breakSelector)
  		as: #'char *'
  		in: aCCodeGenerator.
  	aCCodeGenerator
  		var: #breakSelectorLength
  		declareC: 'sqInt breakSelectorLength = MinSmallInteger'.
  	self declareC: #(stackPage overflowedPage)
  		as: #'StackPage *'
  		in: aCCodeGenerator.
  	aCCodeGenerator
  		var: #transcript type: #'FILE *'.
  	aCCodeGenerator removeVariable: 'stackPages'.  "this is an implicit receiver in the translated code."
  	"This defines bytecodeSetSelector as 0 if MULTIPLEBYTECODESETS
  	 is not defined, for the benefit of the interpreter on slow machines."
  	aCCodeGenerator addConstantForBinding: (self bindingOf: #MULTIPLEBYTECODESETS).
  	MULTIPLEBYTECODESETS == false ifTrue:
  		[aCCodeGenerator
  			removeVariable: 'bytecodeSetSelector'].
  	BytecodeSetHasExtensions == false ifTrue:
  		[aCCodeGenerator
  			removeVariable: 'extA';
  			removeVariable: 'extB'].
  	aCCodeGenerator
  		var: #methodCache
  		declareC: 'sqIntptr_t methodCache[MethodCacheSize + 1 /* ', (MethodCacheSize + 1) printString, ' */]'.
  	NewspeakVM
  		ifTrue:
  			[aCCodeGenerator
  				var: #nsMethodCache
  				declareC: 'sqIntptr_t nsMethodCache[NSMethodCacheSize + 1 /* ', (NSMethodCacheSize + 1) printString, ' */]']
  		ifFalse:
  			[aCCodeGenerator
  				removeVariable: #nsMethodCache;
  				removeVariable: 'localAbsentReceiver';
  				removeVariable: 'localAbsentReceiverOrZero'].
  	AtCacheTotalSize isInteger ifTrue:
  		[aCCodeGenerator
  			var: #atCache
  			declareC: 'sqInt atCache[AtCacheTotalSize + 1 /* ', (AtCacheTotalSize + 1) printString, ' */]'].
  	aCCodeGenerator
  		var: #primitiveTable
  		declareC: 'void (*primitiveTable[MaxPrimitiveIndex + 2 /* ', (MaxPrimitiveIndex + 2) printString, ' */])(void) = ', vmClass primitiveTableString.
  	vmClass primitiveTable do:
  		[:symbolOrNot|
  		(symbolOrNot isSymbol
  		 and: [symbolOrNot ~~ #primitiveFail]) ifTrue:
  			[(aCCodeGenerator methodNamed: symbolOrNot) ifNotNil:
  				[:tMethod| tMethod returnType: #void]]].
  
  	vmClass objectMemoryClass hasSpurMemoryManagerAPI
  		ifTrue:
  			[aCCodeGenerator
  				var: #primitiveAccessorDepthTable
  				type: 'signed char'
  				sizeString: 'MaxPrimitiveIndex + 2 /* ', (MaxPrimitiveIndex + 2) printString, ' */'
  				array: (vmClass primitiveAccessorDepthTableUsing: aCCodeGenerator).
  			 aCCodeGenerator
  				removeConstant: #PrimNumberInstVarAt;
  				removeConstant: #PrimNumberPerform;
  				removeConstant: # PrimNumberPerformWithArgs;
  				removeConstant: #PrimNumberShallowCopy;
  				removeConstant: #PrimNumberSlotAt;
  				removeConstant: #PrimNumberFlushExternalPrimitives;
  				removeConstant: #PrimNumberUnloadModule]
  		ifFalse:
  			[aCCodeGenerator
  				removeVariable: #primitiveAccessorDepthTable;
  				removeConstant: #PrimNumberVMParameter].
  
  	aCCodeGenerator
  		var: #displayBits type: #'void *';
  		var: #primitiveCalloutPointer declareC: 'void *primitiveCalloutPointer = (void *)-1'.
  	#('primitiveDoMixedArithmetic' 'upscaleDisplayIfHighDPI' ) do:
  		 [:var|
  		aCCodeGenerator
  			var: var
  			declareC: 'sqInt ', var, ' = -1'].
  	self declareC: #(displayWidth displayHeight displayDepth) as: #int in: aCCodeGenerator.
  	aCCodeGenerator
  		var: #primitiveFunctionPointer
  			declareC: 'void (*primitiveFunctionPointer)()';
  			var: 'pcPreviousToFunction'
  				declareC: 'sqInt (* const pcPreviousToFunction)(sqInt,sqInt) = ', (aCCodeGenerator cFunctionNameFor: PCPreviousToFunction);
  		var: #externalPrimitiveTable
  			declareC: 'void (*externalPrimitiveTable[MaxExternalPrimitiveTableSize + 1 /* ', (MaxExternalPrimitiveTableSize + 1) printString, ' */])(void)';
  		var: #interruptCheckChain
  			declareC: 'void (*interruptCheckChain)(void) = 0';
  		var: #showSurfaceFn
  			declareC: 'int (*showSurfaceFn)(sqIntptr_t, int, int, int, int)'.
  
  	self declareCAsUSqLong: #(nextPollUsecs nextWakeupUsecs
  								"these are high-frequency enough that they're overflowing quite quickly on modern hardware"
  								statProcessSwitch statIOProcessEvents statForceInterruptCheck
  								statCheckForEvents statStackOverflow statStackPageDivorce
  								statIdleUsecs)
  		in: aCCodeGenerator.
  	aCCodeGenerator var: #nextProfileTick type: #sqLong.
  	aCCodeGenerator var: #reenterInterpreter type: 'jmp_buf'.
  	LowcodeVM
  		ifTrue:
  			[aCCodeGenerator
  				var: #lowcodeCalloutState type: #'sqLowcodeCalloutState*'.
  			 self declareC: #(nativeSP nativeStackPointer shadowCallStackPointer)
  				as: #'char *'
  				in: aCCodeGenerator]
  		ifFalse:
  			[#(lowcodeCalloutState nativeSP nativeStackPointer shadowCallStackPointer) do:
  				[:var| aCCodeGenerator removeVariable: var]].
  	(self instVarNames select: [:ivn| ivn beginsWith: 'longRunningPrimitive']) do:
  		[:lrpmVar|
  		aCCodeGenerator
  			var: lrpmVar
  			declareC: '#if LRPCheck\', ((lrpmVar endsWith: 'Usecs') ifTrue: [#usqLong] ifFalse: [#sqInt]), ' ', lrpmVar, '\#endif']!



More information about the Vm-dev mailing list