[Vm-dev] VM Maker: VMMaker.oscog-eem.465.mcz

commits at source.squeak.org
Thu Oct 17 14:04:14 UTC 2013


Eliot Miranda uploaded a new version of VMMaker to project VM Maker:
http://source.squeak.org/VMMaker/VMMaker.oscog-eem.465.mcz

==================== Summary ====================

Name: VMMaker.oscog-eem.465
Author: eem
Time: 17 October 2013, 6:59:21.625 am
UUID: 76822de3-f107-4e28-93d6-f49bfdf3bbb8
Ancestors: VMMaker.oscog-eem.464

Add room in the class table root object for a small number of hidden
roots, for things like the ephemeron queue.  Hence rename
classTableRootObj to hiddenRootsObj and classTableRootObj: to
hiddenRootsObj:.
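
(For illustration, a minimal sketch using the accessors that appear in the diff
below; the manager temporary and its bootstrap set-up are assumptions, not part
of this commit:)

	| manager |
	manager := SpurMemoryManager new.	"simulation instance; bootstrap set-up elided"
	"hiddenRootsObj holds the class-table page slots first, then a few extra hidden roots"
	(manager numSlotsOf: manager classTableRootObj)
		= (manager classTableRootSlots + manager hiddenRootSlots).	"==> true once bootstrapped"
	manager ephemeronQueue.	"the first hidden root after the class-table pages"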

Fix two bugs in allocateOldSpaceChunkOfBytes:.  When stitching a node back
into the tree, use ofFreeChunk: instead of ofObject:, and when adding back
into the root, handle the fact that both children can be null.
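
(Condensed from the diff below: the stitch-back now guards the parent-link
update, since the removed node's larger child can also be null; sideIndex
stands in for the smaller/larger slot selection in the real code:)

	smaller = 0 ifTrue:
		[self storePointer: sideIndex ofFreeChunk: parent withValue: larger.	"sideIndex: parent's smaller or larger slot"
		 larger ~= 0 ifTrue:	"both children can be null; only fix the back-pointer if larger exists"
			[self storePointer: self freeChunkParentIndex ofFreeChunk: larger withValue: parent]]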

Change the call of sqAllocateMemorySegmentOfSize:Above:AllocatedSizeInto:
to use the end of the first segment as minAddress instead of
newSpaceLimit.  Hopefully this will play better with mmap et al.
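
(That is, the minimum address for a new segment is now computed from the first
segment rather than from newSpaceLimit, roughly:)

	"minAddress passed to sqAllocateMemorySegmentOfSize:Above:AllocatedSizeInto:"
	(segments at: 0) segStart + (segments at: 0) segSize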

Fix comment in initializeSpecialObjectIndices.

=============== Diff against VMMaker.oscog-eem.464 ===============

Item was changed:
  ----- Method: ObjectMemory class>>initializeSpecialObjectIndices (in category 'initialization') -----
  initializeSpecialObjectIndices
  	"Initialize indices into specialObjects array."
  
  	NilObject := 0.
  	FalseObject := 1.
  	TrueObject := 2.
  	SchedulerAssociation := 3.
  	ClassBitmap := 4.
  	ClassInteger := 5.
  	ClassByteString := ClassString := 6. "N.B.  Actually class ByteString"
  	ClassArray := 7.
  	"SmalltalkDictionary := 8."  "Do not delete!!"
  	ClassFloat := 9.
  	ClassMethodContext := 10.
  	ClassBlockContext := 11.
  	ClassPoint := 12.
  	ClassLargePositiveInteger := 13.
  	TheDisplay := 14.
  	ClassMessage := 15.
  	"ClassCompiledMethod := 16. unused by the VM"
  	TheLowSpaceSemaphore := 17.
  	ClassSemaphore := 18.
  	ClassCharacter := 19.
  	SelectorDoesNotUnderstand := 20.
  	SelectorCannotReturn := 21.
  	ProcessSignalingLowSpace := 22.	"was TheInputSemaphore"
  	SpecialSelectors := 23.
  	CharacterTable := 24.
  	SelectorMustBeBoolean := 25.
  	ClassByteArray := 26.
  	"ClassProcess := 27. unused"
  	CompactClasses := 28.
  	TheTimerSemaphore := 29.
  	TheInterruptSemaphore := 30.
  	SelectorCannotInterpret := 34.
  	"Was MethodContextProto := 35."
  	ClassBlockClosure := 36.
  	"Was BlockContextProto := 37."
  	ExternalObjectsArray := 38.
  	ClassMutex := 39.
  	"Was: ClassTranslatedMethod := 40."
  	ProcessInExternalCodeTag := 40.
  	TheFinalizationSemaphore := 41.
  	ClassLargeNegativeInteger := 42.
  
  	ClassExternalAddress := 43.
  	ClassExternalStructure := 44.
  	ClassExternalData := 45.
  	ClassExternalFunction := 46.
  	ClassExternalLibrary := 47.
  
  	SelectorAboutToReturn := 48.
  	SelectorRunWithIn := 49.
  
  	SelectorAttemptToAssign := 50.
+ 	"PrimErrTableIndex := 51. in VMClass class>>initializePrimitiveErrorCodes"
- 	"PrimErrTableIndex := 51. in Interpreter class>>initializePrimitiveErrorCodes"
  	ClassAlien := 52.
  	SelectorInvokeCallback := 53.
  	ClassUnsafeAlien := 54.
  
  	ClassWeakFinalizer := 55.
  
  	ForeignCallbackProcess := 56.
  
  	SelectorUnknownBytecode := 57.
  	SelectorCounterTripped := 58
  !
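
(These indices are used with splObj: to fetch the corresponding well-known
objects from the specialObjects array; a couple of illustrative sends, assuming
an ObjectMemory/interpreter context:)

	self splObj: ClassArray.	"the Array class"
	self splObj: SchedulerAssociation.	"the association whose value is the ProcessorScheduler"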

Item was changed:
  CogClass subclass: #SpurMemoryManager
(excessive size, no diff calculated)

Item was changed:
  ----- Method: SpurMemoryManager class>>initializeSpecialObjectIndices (in category 'class initialization') -----
  initializeSpecialObjectIndices
  	"Initialize indices into specialObjects array."
  
  	NilObject := 0.
  	FalseObject := 1.
  	TrueObject := 2.
  	SchedulerAssociation := 3.
  	ClassBitmap := 4.
  	ClassInteger := 5.
  	ClassByteString := ClassString := 6. "N.B.  Actually class ByteString"
  	ClassArray := 7.
  	"SmalltalkDictionary := 8."  "Do not delete!!"
  	ClassFloat := 9.
  	ClassMethodContext := 10.
  	"ClassBlockContext := 11. unused by the VM"
  	ClassPoint := 12.
  	ClassLargePositiveInteger := 13.
  	TheDisplay := 14.
  	ClassMessage := 15.
  	"ClassCompiledMethod := 16. unused by the VM"
  	TheLowSpaceSemaphore := 17.
  	ClassSemaphore := 18.
  	ClassCharacter := 19.
  	SelectorDoesNotUnderstand := 20.
  	SelectorCannotReturn := 21.
  	ProcessSignalingLowSpace := 22.	"was TheInputSemaphore"
  	SpecialSelectors := 23.
  	CharacterTable := nil.	"Must be unused by the VM"
  	SelectorMustBeBoolean := 25.
  	ClassByteArray := 26.
  	"ClassProcess := 27. unused"
  	CompactClasses := 28.
  	TheTimerSemaphore := 29.
  	TheInterruptSemaphore := 30.
  	SelectorCannotInterpret := 34.
  	"Was MethodContextProto := 35."
  	ClassBlockClosure := 36.
  	"Was BlockContextProto := 37."
  	ExternalObjectsArray := 38.
  	ClassMutex := 39.
  	"Was: ClassTranslatedMethod := 40."
  	ProcessInExternalCodeTag := 40.
  	TheFinalizationSemaphore := 41.
  	ClassLargeNegativeInteger := 42.
  
  	ClassExternalAddress := 43.
  	ClassExternalStructure := 44.
  	ClassExternalData := 45.
  	ClassExternalFunction := 46.
  	ClassExternalLibrary := 47.
  
  	SelectorAboutToReturn := 48.
  	SelectorRunWithIn := 49.
  
  	SelectorAttemptToAssign := 50.
+ 	"PrimErrTableIndex := 51. in VMClass class>>initializePrimitiveErrorCodes"
- 	"PrimErrTableIndex := 51. in Interpreter class>>initializePrimitiveErrorCodes"
  	ClassAlien := 52.
  	SelectorInvokeCallback := 53.
  	ClassUnsafeAlien := 54.
  
  	ClassWeakFinalizer := 55.
  
  	ForeignCallbackProcess := 56.
  
  	SelectorUnknownBytecode := 57.
  	SelectorCounterTripped := 58!

Item was changed:
+ ----- Method: SpurMemoryManager>>allocateMemoryOfSize:newSpaceSize:stackSize:codeSize: (in category 'spur bootstrap') -----
- ----- Method: SpurMemoryManager>>allocateMemoryOfSize:newSpaceSize:stackSize:codeSize: (in category 'simulation') -----
  allocateMemoryOfSize: memoryBytes newSpaceSize: newSpaceBytes stackSize: stackBytes codeSize: codeBytes
  	"Initialize the receiver for bootstrapping an image.
  	 Set up a large oldSpace and an empty newSpace, and set up freeStart and scavengeThreshold
  	 to allocate in oldSpace.  Later on (in initializePostBootstrap) freeStart and scavengeThreshold
  	 will be set to sane values."
  	<doNotGenerate>
  	| endBridgeBytes |
  	self assert: (memoryBytes \\ self allocationUnit = 0
  				and: [newSpaceBytes \\ self allocationUnit = 0
  				and: [codeBytes \\ self allocationUnit = 0]]).
  	endBridgeBytes := 2 * self baseHeaderSize.
  	memory := (self endianness == #little
  					ifTrue: [LittleEndianBitmap]
  					ifFalse: [Bitmap]) new: (memoryBytes + newSpaceBytes + codeBytes + stackBytes + endBridgeBytes) // 4.
  	startOfMemory := codeBytes + stackBytes.
  	endOfMemory := freeOldSpaceStart := memoryBytes + newSpaceBytes + codeBytes + stackBytes.
  	"leave newSpace empty for the bootstrap"
  	freeStart := newSpaceBytes + startOfMemory.
  	newSpaceLimit := newSpaceBytes + startOfMemory.
  	scavengeThreshold := memory size * 4. "Bitmap is a 4-byte per word array"
  	scavenger := SpurGenerationScavengerSimulator new
  					manager: self
  					newSpaceStart: startOfMemory
  					newSpaceBytes: newSpaceBytes
  					edenBytes: newSpaceBytes * self scavengerDenominator - self numSurvivorSpaces // self scavengerDenominator!

Item was changed:
  ----- Method: SpurMemoryManager>>allocateOldSpaceChunkOfBytes: (in category 'free space') -----
  allocateOldSpaceChunkOfBytes: chunkBytes
  	"Answer a chunk of oldSpace from the free lists, if available,
  	 otherwise answer nil.  Break up a larger chunk if one of the
  	 exact size does not exist.  N.B.  the chunk is simply a pointer, it
  	 has no valid header.  The caller *must* fill in the header correctly."
  	| initialIndex chunk index nodeBytes parent child smaller larger |
  	"for debugging:" "totalFreeOldSpace := self totalFreeListBytes"
  	totalFreeOldSpace := totalFreeOldSpace - chunkBytes. "be optimistic (& don't wait for the write)"
  	initialIndex := chunkBytes / self allocationUnit.
  	(initialIndex < self numFreeLists and: [1 << initialIndex <= freeListsMask]) ifTrue:
  		[(freeListsMask anyMask: 1 << initialIndex) ifTrue:
  			[(chunk := freeLists at: initialIndex) ~= 0 ifTrue:
  				[self assert: chunk = (self startOfObject: chunk).
  				 self assert: (self isValidFreeObject: chunk).
  				^self unlinkFreeChunk: chunk atIndex: initialIndex].
  			 freeListsMask := freeListsMask - (1 << initialIndex)].
  		 "first search for free chunks of a multiple of chunkBytes in size"
  		 index := initialIndex.
  		 [(index := index + index) < self numFreeLists
  		  and: [1 << index <= freeListsMask]] whileTrue:
  			[((freeListsMask anyMask: 1 << index)
  			 and: [(chunk := freeLists at: index) ~= 0]) ifTrue:
  				[self assert: chunk = (self startOfObject: chunk).
  				 self assert: (self isValidFreeObject: chunk).
  				 self unlinkFreeChunk: chunk atIndex: index.
  				 self assert: (self bytesInObject: chunk) = (index * self allocationUnit).
  				 self freeChunkWithBytes: index * self allocationUnit - chunkBytes
  					at: (self startOfObject: chunk) + chunkBytes.
  				^chunk]].
  		 "now get desperate and use the first that'll fit.
  		  Note that because the minimum free size is 16 bytes (2 * allocationUnit), to
  		  leave room for the forwarding pointer/next free link, we can only break chunks
  		  that are at least 16 bytes larger, hence start at initialIndex + 2."
  		 index := initialIndex + 1.
  		 [(index := index + 1) < self numFreeLists
  		  and: [1 << index <= freeListsMask]] whileTrue:
  			[(freeListsMask anyMask: 1 << index) ifTrue:
  				[(chunk := freeLists at: index) ~= 0 ifTrue:
  					[self assert: chunk = (self startOfObject: chunk).
  					 self assert: (self isValidFreeObject: chunk).
  					 self unlinkFreeChunk: chunk atIndex: index.
  					 self assert: (self bytesInObject: chunk) = (index * self allocationUnit).
  					 self freeChunkWithBytes: index * self allocationUnit - chunkBytes
  						at: (self startOfObject: chunk) + chunkBytes.
  					^chunk].
  				 freeListsMask := freeListsMask - (1 << index)]]].
  
  	"Large chunk, or no space on small free lists.  Search the large chunk list.
  	 Large chunk list organized as a tree, each node of which is a list of chunks
  	 of the same size. Beneath the node are smaller and larger blocks.
  	 When the search ends parent should hold the smallest chunk at least as
  	 large as chunkBytes, or 0 if none."
  	parent := 0.
  	child := freeLists at: 0.
  	[child ~= 0] whileTrue:
  		[| childBytes |
  		 self assert: (self isValidFreeObject: child).
  		 childBytes := self bytesInObject: child.
  		 childBytes = chunkBytes
  			ifTrue: "size match; try to remove from list at node."
  				[chunk := self fetchPointer: self freeChunkNextIndex
  								ofFreeChunk: child.
  				 chunk ~= 0 ifTrue:
  					[self assert: (self isValidFreeObject: chunk).
  					 self storePointer: self freeChunkNextIndex
  						ofFreeChunk: child
  						withValue: (self fetchPointer: self freeChunkNextIndex
  										ofFreeChunk: chunk).
  					 ^self startOfObject: chunk].
  				 child := 0] "break out of loop to remove interior node"
  			ifFalse:
  				["Note that because the minimum free size is 16 bytes (2 * allocationUnit), to
  				  leave room for the forwarding pointer/next free link, we can only break chunks
  				  that are at least 16 bytes larger, hence reject chunks < 2 * allocationUnit larger."
  				childBytes <= (chunkBytes + self allocationUnit)
  					ifTrue: "node too small; walk down the larger size of the tree"
  						[child := self fetchPointer: self freeChunkLargerIndex ofFreeChunk: child]
  					ifFalse:
  						[parent := child. "parent will be smallest node >= chunkBytes + allocationUnit"
  						 nodeBytes := childBytes.
  						 child := self fetchPointer: self freeChunkSmallerIndex ofFreeChunk: child]]].
  	parent = 0 ifTrue:
  		[totalFreeOldSpace := totalFreeOldSpace + chunkBytes. "optimism was unfounded"
  		 ^nil].
  
  	"self printFreeChunk: parent"
  	self assert: (nodeBytes = chunkBytes or: [nodeBytes >= (chunkBytes + (2 * self allocationUnit))]).
  	self assert: (self bytesInObject: parent) = nodeBytes.
  
  	"attempt to remove from list"
  	chunk := self fetchPointer: self freeChunkNextIndex
  					ofFreeChunk: parent.
  	chunk ~= 0 ifTrue:
  		[self assert: (chunkBytes = nodeBytes or: [chunkBytes + self allocationUnit < nodeBytes]).
  		 self storePointer: self freeChunkNextIndex
  			ofFreeChunk: parent
  			withValue: (self fetchPointer: self freeChunkNextIndex
  							ofFreeChunk: chunk).
  		 chunkBytes ~= nodeBytes ifTrue:
  			[self freeChunkWithBytes: nodeBytes - chunkBytes
  					at: (self startOfObject: chunk) + chunkBytes].
  		 ^self startOfObject: chunk].
  
  	"no list; remove an interior node; reorder tree simply.  two cases (which have mirrors, for four total):
  	 case 1. interior node has one child, P = parent, N = node, S = subtree (mirrored for large vs small)
  			___				  ___
  			| P |				  | P |
  		    _/_				_/_
  		    | N |		=>		| S |
  		 _/_
  		 | S |
  
  	 case 2: interior node has two children, P = parent, N = node, L = smaller (left) subtree, R = larger (right) subtree.
  	 add the left subtree to the bottom left of the right subtree (mirrored for large vs small) 
  			___				  ___
  			| P |				  | P |
  		    _/_				_/_
  		    | N |		=>		| R |
  		 _/_  _\_		    _/_
  		 | L | | R |		    | L |"
  
  	chunk := parent.
  	smaller := self fetchPointer: self freeChunkSmallerIndex ofFreeChunk: chunk.
  	larger := self fetchPointer: self freeChunkLargerIndex ofFreeChunk: chunk.
  	parent := self fetchPointer: self freeChunkParentIndex ofFreeChunk: chunk.
  	parent = 0
  		ifTrue: "no parent; stitch the subnodes back into the root"
  			[smaller = 0
  				ifTrue:
  					[self storePointer: self freeChunkParentIndex ofFreeChunk: larger withValue: 0.
  					 freeLists at: 0 put: larger]
  				ifFalse:
  					[self storePointer: self freeChunkParentIndex ofFreeChunk: smaller withValue: 0.
  					 freeLists at: 0 put: smaller.
  					 larger ~= 0 ifTrue:
  						[self addFreeSubTree: larger]]]
  		ifFalse: "parent; stitch back into appropriate side of parent."
  			[smaller = 0
+ 				ifTrue:
+ 					[self storePointer: (chunk = (self fetchPointer: self freeChunkSmallerIndex ofFreeChunk: parent)
+ 										ifTrue: [self freeChunkSmallerIndex]
+ 										ifFalse: [self freeChunkLargerIndex])
- 				ifTrue: [self storePointer: (chunk = (self fetchPointer: self freeChunkSmallerIndex ofFreeChunk: parent)
- 											ifTrue: [self freeChunkSmallerIndex]
- 											ifFalse: [self freeChunkLargerIndex])
  							ofFreeChunk: parent
  							withValue: larger.
+ 					 larger ~= 0 ifTrue:
+ 						[self storePointer: self freeChunkParentIndex
+ 							ofFreeChunk: larger
+ 							withValue: parent]]
- 						self storePointer: self freeChunkParentIndex
- 							ofObject: larger
- 							withValue: parent]
  				ifFalse:
  					[self storePointer: (chunk = (self fetchPointer: self freeChunkSmallerIndex ofFreeChunk: parent)
  											ifTrue: [self freeChunkSmallerIndex]
  											ifFalse: [self freeChunkLargerIndex])
  						ofFreeChunk: parent
  						withValue: smaller.
  					 self storePointer: self freeChunkParentIndex
+ 						ofFreeChunk: smaller
- 						ofObject: smaller
  						withValue: parent.
  					 larger ~= 0 ifTrue:
  						[self addFreeSubTree: larger]]].
  	"if there's space left over, add the fragment back."
  	chunkBytes ~= nodeBytes ifTrue:
  		[self freeChunkWithBytes: nodeBytes - chunkBytes
  				at: (self startOfObject: chunk) + chunkBytes].
  	^self startOfObject: chunk!
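
(For readers new to the Spur free tree: each large free chunk doubles as a tree
node with four pointer fields, reached via the index selectors used above. A
sketch of reading them, given a free tree node in 'node'; the concrete index
values are not part of this diff:)

	next := self fetchPointer: self freeChunkNextIndex ofFreeChunk: node.	"next chunk of the same size"
	smaller := self fetchPointer: self freeChunkSmallerIndex ofFreeChunk: node.	"subtree of smaller chunks"
	larger := self fetchPointer: self freeChunkLargerIndex ofFreeChunk: node.	"subtree of larger chunks"
	parent := self fetchPointer: self freeChunkParentIndex ofFreeChunk: node.	"parent node in the tree"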

Item was changed:
  ----- Method: SpurMemoryManager>>classAtIndex: (in category 'class table') -----
  classAtIndex: classIndex
  	| classTablePage |
  	self assert: (classIndex <= self tagMask or: [classIndex >= self arrayClassIndexPun]).
  	classTablePage := self fetchPointer: classIndex >> self classTableMajorIndexShift
+ 							ofObject: hiddenRootsObj.
- 							ofObject: classTableRootObj.
  	classTablePage = nilObj ifTrue:
  		[^nil].
  	^self
  		fetchPointer: (classIndex bitAnd: self classTableMinorIndexMask)
  		ofObject: classTablePage!

Item was changed:
  ----- Method: SpurMemoryManager>>classAtIndex:put: (in category 'class table') -----
  classAtIndex: classIndex put: objOop
  	"for become & GC of classes"
  	| classTablePage |
  	self assert: (classIndex <= self tagMask or: [classIndex >= self arrayClassIndexPun]).
  	self assert: (objOop = nilObj or: [(self rawHashBitsOf: objOop) = classIndex]).
  	classTablePage := self fetchPointer: classIndex >> self classTableMajorIndexShift
+ 							ofObject: hiddenRootsObj.
- 							ofObject: classTableRootObj.
  	classTablePage = nilObj ifTrue:
  		[self error: 'attempt to add class to empty page'].
  	^self
  		storePointer: (classIndex bitAnd: self classTableMinorIndexMask)
  		ofObject: classTablePage
  		withValue: objOop!

Item was changed:
+ ----- Method: SpurMemoryManager>>classTableIndex: (in category 'spur bootstrap') -----
- ----- Method: SpurMemoryManager>>classTableIndex: (in category 'accessing') -----
  classTableIndex: n
  	classTableIndex := n!

Item was changed:
+ ----- Method: SpurMemoryManager>>classTableObjectsDo: (in category 'spur bootstrap') -----
- ----- Method: SpurMemoryManager>>classTableObjectsDo: (in category 'object enumeration') -----
  classTableObjectsDo: aBlock
+ 	"for the bootstrap..."
+ 	<doNotGenerate>
+ 	0 to: self classTableRootSlots - 1 do:
- 	0 to: (self numSlotsOf: classTableRootObj) - 1 do:
  		[:i| | page |
+ 		page := self fetchPointer: i ofObject: hiddenRootsObj.
- 		page := self fetchPointer: i ofObject: classTableRootObj.
  		0 to: (self numSlotsOf: page) - 1 do:
  			[:j| | classOrNil |
  			classOrNil := self fetchPointer: j ofObject: page.
  			classOrNil ~= nilObj ifTrue:
  				[aBlock value: classOrNil]]]!

Item was changed:
  ----- Method: SpurMemoryManager>>classTableRootObj (in category 'accessing') -----
  classTableRootObj
+ 	"For Cogit & bootstrap"
+ 	^hiddenRootsObj!
- 	"For mapInterpreterOops & bootstrap"
- 	^classTableRootObj!

Item was removed:
- ----- Method: SpurMemoryManager>>classTableRootObj: (in category 'class table') -----
- classTableRootObj: anOop
- 	classTableRootObj := anOop.
- 	classTableFirstPage := self fetchPointer: 0 ofObject: classTableRootObj.
- 	self assert: (self numSlotsOf: classTableRootObj) = self classTableRootSlots.
- 	self assert: (self numSlotsOf: classTableFirstPage) - 1 = self classTableMinorIndexMask.
- 	"Set classTableIndex to the start of the last used page (excepting first page).
- 	 Set numClassTablePages to the number of used pages."
- 	numClassTablePages := self numSlotsOf: classTableRootObj.
- 	2 to: numClassTablePages - 1 do:
- 		[:i|
- 		(self fetchPointer: i ofObject: classTableRootObj) = nilObj ifTrue:
- 			[numClassTablePages := i.
- 			 classTableIndex := (numClassTablePages - 1 max: 1) << self classTableMajorIndexShift.
- 			 ^self]].
- 	"no unused pages; set it to the start of the second page."
- 	classTableIndex := 1 << self classTableMajorIndexShift!

Item was changed:
  ----- Method: SpurMemoryManager>>classTableRootSlots (in category 'class table') -----
  classTableRootSlots
+ 	"Answer the number of slots for class table pages in the hidden root object."
- 	"Answer the number of slots in the root of the class table."
  	^1 << (self classIndexFieldWidth - self classTableMajorIndexShift)!
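
(A worked example, assuming Spur's usual figures of a 22-bit class index field
and a major index shift of 10, i.e. 1024-entry pages; these constants are not
stated in this diff:)

	1 << (22 - 10).	"==> 4096 slots for class-table pages in the hidden roots object"
	1 << 10.	"==> 1024 class entries per page"
	4096 * 1024.	"==> 4194304 possible class indices"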

Item was changed:
  ----- Method: SpurMemoryManager>>countNumClassPagesPreSwizzle: (in category 'class table') -----
  countNumClassPagesPreSwizzle: bytesToShift
  	"Compute the used size of the class table before swizzling.  Needed to
  	 initialize the classTableBitmap which is populated during adjustAllOopsBy:"
  	<returnTypeC: #void>
  	| firstObj classTableRoot nilObjPreSwizzle |
  	firstObj := self objectStartingAt: newSpaceLimit. "a.k.a. nilObj"
  	"first five objects are nilObj, falseObj, trueObj, freeListsObj, classTableRootObj"
  	classTableRoot := self objectAfter:
  							(self objectAfter:
  									(self objectAfter:
  											(self objectAfter: firstObj
  												limit: freeOldSpaceStart)
  										limit: freeOldSpaceStart)
  								limit: freeOldSpaceStart)
  							limit: freeOldSpaceStart.
  	nilObjPreSwizzle := newSpaceLimit - bytesToShift.
  	numClassTablePages := self numSlotsOf: classTableRoot.
+ 	self assert: numClassTablePages = (self classTableRootSlots + self hiddenRootSlots).
- 	self assert: numClassTablePages = self classTableRootSlots.
  	2 to: numClassTablePages - 1 do:
  		[:i|
  		(self fetchPointer: i ofObject: classTableRoot) = nilObjPreSwizzle ifTrue:
  			[numClassTablePages := i.
  			 ^self]]
  	!

Item was changed:
  ----- Method: SpurMemoryManager>>enterIntoClassTable: (in category 'class table') -----
  enterIntoClassTable: aBehavior
  	"Enter aBehavior into the class table and answer 0.  Otherwise answer a primitive failure code."
  	<inline: false>
  	| initialMajorIndex majorIndex minorIndex page |
  	majorIndex := classTableIndex >> self classTableMajorIndexShift.
  	initialMajorIndex := majorIndex.
  	"classTableIndex should never index the first page; it's reserved for known classes"
  	self assert: initialMajorIndex > 0.
  	minorIndex := classTableIndex bitAnd: self classTableMinorIndexMask.
  
+ 	[page := self fetchPointer: majorIndex ofObject: hiddenRootsObj.
- 	[page := self fetchPointer: majorIndex ofObject: classTableRootObj.
  	 page = nilObj ifTrue:
  		[page := self allocateSlotsInOldSpace: self classTablePageSize
  					format: self arrayFormat
  					classIndex: self arrayClassIndexPun.
  		 page ifNil:
  			[^PrimErrNoMemory].
  		 self fillObj: page numSlots: self classTablePageSize with: nilObj.
  		 self storePointer: majorIndex
+ 			ofObject: hiddenRootsObj
- 			ofObject: classTableRootObj
  			withValue: page.
  		 numClassTablePages := numClassTablePages + 1.
  		 minorIndex := 0].
  	 minorIndex to: self classTablePageSize - 1 do:
  		[:i|
  		(self fetchPointer: i ofObject: page) = nilObj ifTrue:
  			[classTableIndex := majorIndex << self classTableMajorIndexShift + i.
  			 self storePointer: i
  				ofObject: page
  				withValue: aBehavior.
  			 self setHashBitsOf: aBehavior to: classTableIndex.
  			 self assert: (self classAtIndex: (self rawHashBitsOf: aBehavior)) = aBehavior.
  			 "now fault-in method lookup chain."
  			 self scanClassPostBecome: aBehavior
  				effects: BecamePointerObjectFlag+BecameCompiledMethodFlag.
  			 self ensureAdequateClassTableBitmap.
  			 ^0]].
  	 majorIndex := (majorIndex + 1 bitAnd: self classIndexMask) max: 1.
  	 majorIndex = initialMajorIndex ifTrue: "wrapped; table full"
  		[^PrimErrLimitExceeded]] repeat!

Item was added:
+ ----- Method: SpurMemoryManager>>ephemeronQueue (in category 'garbage collection') -----
+ ephemeronQueue
+ 	"The ephemeron queue is the first hidden root after the class table pages."
+ 	^self fetchPointer: self numClassTablePages ofObject: hiddenRootsObj!

Item was added:
+ ----- Method: SpurMemoryManager>>ephemeronQueue: (in category 'garbage collection') -----
+ ephemeronQueue: anObject
+ 	"The ephemeron queue is the first hidden root after the class table pages."
+ 	self storePointer: self numClassTablePages ofObject: hiddenRootsObj withValue: anObject!

Item was changed:
  ----- Method: SpurMemoryManager>>expungeDuplicateClasses (in category 'class table') -----
  expungeDuplicateClasses
  	"Bits have been set in the classTableBitmap corresponding to
  	 used classes.  Any class in the class table that does not have a
  	 bit set has no instances with that class index.  However, becomeForward:
  	 can create duplicate entries, and these duplicate entries
  		a) won't have a bit set on load (because there are no forwarders on load),
  		b) won't match their identityHash.
  	 So expunge duplicates by eliminating unmarked entries that don't occur at
  	 their identityHash."
  	1 to: numClassTablePages - 1 do:
  		[:i| | classTablePage |
  		"optimize scan by only scanning bitmap in regions that have pages."
+ 		classTablePage := self fetchPointer: i ofObject: hiddenRootsObj.
- 		classTablePage := self fetchPointer: i ofObject: classTableRootObj.
  		classTablePage ~= nilObj ifTrue:
  			[i << self classTableMajorIndexShift
  				to: i << self classTableMajorIndexShift + self classTableMinorIndexMask
  				by: 8
  				do: [:majorBitIndex| | byteIndex byte classIndex classOrNil |
  					"optimize scan by scanning a byte of indices (8 indices) at a time"
  					byteIndex := majorBitIndex / BitsPerByte.
  					byte := classTableBitmap at: byteIndex.
  					byte ~= 255 ifTrue:
  						[0 to: 7 do:
  							[:minorBitIndex|
  							(byte noMask: 1 << minorBitIndex) ifTrue:
  								[classIndex := majorBitIndex + minorBitIndex.
  								 classOrNil := self fetchPointer: (classIndex bitAnd: self classTableMinorIndexMask)
  												   ofObject: classTablePage.
  								 self assert: (self classAtIndex: classIndex) = classOrNil.
  								 "only remove a class if it is at a duplicate entry"
  								 (classOrNil ~= nilObj
  								  and: [(self rawHashBitsOf: classOrNil) ~= classIndex]) ifTrue:
  									[self storePointerUnchecked: (classIndex bitAnd: self classTableMinorIndexMask)
  										ofObject: classTablePage
  										withValue: nilObj.
  									 "but it should still be in the table at its correct index."
  									 self assert: ((self classAtIndex: (self rawHashBitsOf: classOrNil)) = classOrNil)]]]]]]]!

Item was changed:
  ----- Method: SpurMemoryManager>>expungeFromClassTable: (in category 'class table') -----
  expungeFromClassTable: aBehavior
  	"Remove aBehavior from the class table."
  	<inline: false>
  	| classIndex majorIndex minorIndex classTablePage |
  	self assert: (self isInClassTable: aBehavior).
  	classIndex := self rawHashBitsOf: aBehavior.
  	majorIndex := classIndex >> self classTableMajorIndexShift.
  	minorIndex := classIndex bitAnd: self classTableMinorIndexMask.
+ 	classTablePage := self fetchPointer: majorIndex ofObject: hiddenRootsObj.
- 	classTablePage := self fetchPointer: majorIndex ofObject: classTableRootObj.
  	self assert: classTablePage ~= classTableFirstPage.
  	self assert: (self numSlotsOf: classTablePage) = self classTablePageSize.
  	self assert: (self fetchPointer: minorIndex ofObject: classTablePage) = aBehavior.
  	self storePointerUnchecked: minorIndex ofObject: classTablePage withValue: nilObj.
  	"If the removed class is before the classTableIndex, set the
  	 classTableIndex to point to the empty slot so as to reuse it asap."
  	classIndex < classTableIndex ifTrue:
  		[classTableIndex := classIndex]!

Item was added:
+ ----- Method: SpurMemoryManager>>hiddenRootSlots (in category 'class table') -----
+ hiddenRootSlots
+ 	"Answer the number of extra root slots in the root of the hidden root object."
+ 	^8!

Item was added:
+ ----- Method: SpurMemoryManager>>hiddenRootsObj: (in category 'class table') -----
+ hiddenRootsObj: anOop
+ 	hiddenRootsObj := anOop.
+ 	classTableFirstPage := self fetchPointer: 0 ofObject: hiddenRootsObj.
+ 	self assert: (self numSlotsOf: hiddenRootsObj) = (self classTableRootSlots + self hiddenRootSlots).
+ 	self assert: (self numSlotsOf: classTableFirstPage) - 1 = self classTableMinorIndexMask.
+ 	self cCode: [self assert: self validClassTableRootPages]
+ 		inSmalltalk: [numClassTablePages ifNotNil:
+ 						[self assert: self validClassTableRootPages]]..
+ 	"Set classTableIndex to the start of the last used page (excepting first page).
+ 	 Set numClassTablePages to the number of used pages."
+ 	numClassTablePages := self classTableRootSlots.
+ 	2 to: numClassTablePages - 1 do:
+ 		[:i|
+ 		(self fetchPointer: i ofObject: hiddenRootsObj) = nilObj ifTrue:
+ 			[numClassTablePages := i.
+ 			 classTableIndex := (numClassTablePages - 1 max: 1) << self classTableMajorIndexShift.
+ 			 ^self]].
+ 	"no unused pages; set it to the start of the second page."
+ 	classTableIndex := 1 << self classTableMajorIndexShift!

Item was changed:
+ ----- Method: SpurMemoryManager>>initializePostBootstrap (in category 'spur bootstrap') -----
- ----- Method: SpurMemoryManager>>initializePostBootstrap (in category 'simulation') -----
  initializePostBootstrap
  	"The heap has just been bootstrapped into a modified newSpace occupying all of memory
  	 above newSpace (and the codeZone). Put things back to some kind of normalcy."
  	freeOldSpaceStart := freeStart.
  	freeStart := scavenger eden start.
  	pastSpaceStart := scavenger pastSpace start.
  	scavengeThreshold := scavenger eden limit - (scavenger edenBytes / 64)!

Item was changed:
  ----- Method: SpurMemoryManager>>lookupAddress: (in category 'simulation only') -----
  lookupAddress: address
  	"If address appears to be that of a Symbol or a few well-known objects (such as classes) answer it, otherwise answer nil.
  	 For code disassembly"
  	<doNotGenerate>
  	| fmt size string class classSize maybeThisClass classNameIndex thisClassIndex |
  	(self addressCouldBeObj: address) ifFalse:
  		[^nil].
+ 	address - self baseHeaderSize = hiddenRootsObj ifTrue:
+ 		[^'(hiddenRootsObj+baseHeaderSize)'].
- 	address - self baseHeaderSize = classTableRootObj ifTrue:
- 		[^'(classTableRoot+baseHeaderSize)'].
  	fmt := self formatOf: address.
  	size := self lengthOf: address baseHeader: (self baseHeader: address) format: fmt.
  	size = 0 ifTrue:
  		[^address caseOf: { [nilObj] -> ['nil']. [trueObj] -> ['true']. [falseObj] -> ['false'] } otherwise: []].
  	((fmt between: self firstByteFormat and: self firstCompiledMethodFormat - 1) "indexable byte fields"
  	and: [(size between: 1 and: 64)
  	and: [Scanner isLiteralSymbol: (string := (0 to: size - 1) collect: [:i| Character value: (self fetchByte: i ofObject: address)])]]) ifTrue:
  		[^'#', (ByteString withAll: string)].
  	class := self fetchClassOfNonImm: address.
  	(class isNil or: [class = nilObj]) ifTrue:
  		[^nil].
  	"address is either a class or a metaclass, or an instance of a class or invalid.  determine which."
  	classNameIndex := coInterpreter classNameIndex.
  	thisClassIndex := coInterpreter thisClassIndex.
  	((classSize := self numSlotsOf: class) <= (classNameIndex max: thisClassIndex)
  	 or: [classSize > 255]) ifTrue:
  		[^nil].
  	"Address could be a class or a metaclass"
  	(fmt = 1 and: [size >= classNameIndex]) ifTrue:
  		["Is address a class? If so class's thisClass is address."
  		 (self lookupAddress: (self fetchPointer: classNameIndex ofObject: address)) ifNotNil:
  			[:maybeClassName|
  			(self fetchPointer: thisClassIndex ofObject: class) = address ifTrue:
  				[^maybeClassName allButFirst]].
  		"Is address a Metaclass?  If so class's name is Metaclass and address's thisClass holds the class name"
  		((self isBytes: (self fetchPointer: classNameIndex ofObject: class))
  		 and: [(self lookupAddress: (self fetchPointer: classNameIndex ofObject: class)) = '#Metaclass'
  		 and: [size >= thisClassIndex]]) ifTrue:
  			[maybeThisClass := self fetchPointer: thisClassIndex ofObject: address.
  			(self lookupAddress: (self fetchPointer: classNameIndex ofObject: maybeThisClass)) ifNotNil:
  				[:maybeThisClassName| ^maybeThisClassName allButFirst, ' class']]].
  	^(self lookupAddress: (self fetchPointer: classNameIndex ofObject: class)) ifNotNil:
  		[:maybeClassName| 'a(n) ', maybeClassName allButFirst]!

Item was changed:
  ----- Method: SpurMemoryManager>>postBecomeScanClassTable (in category 'become implementation') -----
  postBecomeScanClassTable
  	"Scan the class table post-become (iff a pointer object or compiled method was becommed).
  	 Note that one-way become can cause duplications in the class table.
  	 When can these be eliminated?  We use the classTableBitmap to mark classTable entries
  	 (not the classes themselves, since marking a class doesn't help in knowing if its index is used).
  	 On image load, and during incremental scan-mark and full GC, classIndices are marked.
  	 We can somehow avoid following classes from the classTable until after this mark phase."
+ 
+ 	self assert: self validClassTableRootPages.
+ 
  	(becomeEffectsFlags anyMask: BecamePointerObjectFlag+BecameCompiledMethodFlag) ifFalse: [^self].
  	
+ 	0 to: numClassTablePages - 1 do:
- 	0 to: (self numSlotsOf: classTableRootObj) - 1 do:
  		[:i| | page |
+ 		page := self fetchPointer: i ofObject: hiddenRootsObj.
- 		page := self fetchPointer: i ofObject: classTableRootObj.
  		0 to: (self numSlotsOf: page) - 1 do:
  			[:j| | classOrNil |
  			classOrNil := self fetchPointer: j ofObject: page.
  			classOrNil ~= nilObj ifTrue:
  				[(self isForwarded: classOrNil) ifTrue:
  					[classOrNil := self followForwarded: classOrNil.
  					 self storePointer: j ofObject: page withValue: classOrNil].
  				 self scanClassPostBecome: classOrNil effects: becomeEffectsFlags]]]!

Item was changed:
  ----- Method: SpurMemoryManager>>reInitializeClassTablePostLoad: (in category 'class table') -----
+ reInitializeClassTablePostLoad: hiddenRoots
+ 	self hiddenRootsObj: hiddenRoots.
- reInitializeClassTablePostLoad: classTableRoot
- 	self classTableRootObj: classTableRoot.
  	self expungeDuplicateClasses!

Item was changed:
+ ----- Method: SpurMemoryManager>>scavenger (in category 'spur bootstrap') -----
- ----- Method: SpurMemoryManager>>scavenger (in category 'debug support') -----
  scavenger
  	<doNotGenerate>
  	^scavenger!

Item was changed:
+ ----- Method: SpurMemoryManager>>setCheckForLeaks: (in category 'spur bootstrap') -----
- ----- Method: SpurMemoryManager>>setCheckForLeaks: (in category 'debug support') -----
  setCheckForLeaks: anInteger
  	" 0 = do nothing.
  	  1 = check for leaks on fullGC.
  	  2 = check for leaks on scavenger.
  	  4 = check for leaks on become
  	  8 = check for leaks on truly incremental.
  	15 = check for leaks on all four."
  	checkForLeaks := anInteger!

Item was added:
+ ----- Method: SpurMemoryManager>>validClassTableRootPages (in category 'class table') -----
+ validClassTableRootPages
+ 	"Answer if numClassTablePages is correct."
+ 
+ 	"is it in range?"
+ 	(numClassTablePages > 1 and: [numClassTablePages <= self classTableRootSlots]) ifFalse:
+ 		[^false].
+ 	"are all pages the right size?"
+ 	0 to: numClassTablePages - 1 do:
+ 		[:i| | obj |
+ 		 obj := self fetchPointer: i ofObject: hiddenRootsObj.
+ 		 ((self addressCouldBeObj: obj)
+ 		  and: [(self numSlotsOf: obj) = self classTablePageSize]) ifFalse:
+ 			[^false]].
+ 	"are all entries beyond numClassTablePages nil?"
+ 	numClassTablePages to: self classTableRootSlots - 1 do:
+ 		[:i|
+ 		(self fetchPointer: i ofObject: hiddenRootsObj) ~= nilObj ifTrue:
+ 			[^false]].
+ 	^true!

Item was changed:
  ----- Method: SpurSegmentManager>>addSegmentOfSize: (in category 'growing/shrinking memory') -----
  addSegmentOfSize: ammount
  	<returnTypeC: #'SpurSegmentInfo *'>
  	| allocatedSize |
  	<var: #newSeg type: #'SpurSegmentInfo *'>
  	(manager "sent to the manager so that the simulator can increase memory to simulate a new segment"
  			sqAllocateMemorySegmentOfSize: ammount
+ 			Above: (segments at: 0) segStart + (segments at: 0) segSize
- 			Above: manager newSpaceLimit
  			AllocatedSizeInto: (self cCode: [self addressOf: allocatedSize]
  									inSmalltalk: [[:sz| allocatedSize := sz]])) ifNotNil:
  		[:segAddress| | newSegIndex newSeg |
  		 newSegIndex := self insertSegmentFor: segAddress.
  		 newSeg := self addressOf: (segments at: newSegIndex).
  		 newSeg
  			segStart: segAddress;
  			segSize: allocatedSize.
  		 self bridgeFrom: (self addressOf: (segments at: newSegIndex - 1)) to: newSeg.
  		 self bridgeFrom: newSeg to: (newSegIndex = (numSegments - 1) ifFalse:
  										[self addressOf: (segments at: newSegIndex + 1)]).
  		 "and add the new free chunk to the free list; done here
  		  instead of in assimilateNewSegment: for the assert"
  		 manager addFreeChunkWithBytes: allocatedSize - manager bridgeSize at: newSeg segStart.
  		 self assert: (manager addressAfter: (manager objectStartingAt: newSeg segStart))
  					= (newSeg segStart + newSeg segSize - manager bridgeSize).
  		 ^newSeg].
  	^nil!


