- My number one desire, which is not on your list, is to stop talking
about "the" map. At the very least, there should be a map per version of Squeak, because in practice packages in Squeak 3.6 are not going to work in Squeak 3.0 unless you make at least a few changes.
I think there are, and will continue to be, plenty of packages that would work across multiple versions of Squeak: any "independent" package that does not depend heavily on the volatile guts of Squeak. A Chess engine, for example.
I didn't say you couldn't post a package to multiple maps. I expect that it will be less common than you seem to think -- for example, the changes to #new caused a *lot* of legacy code to break -- but whatever the ratio ends up being, there are definitely packages already that want different versions in different versions of Squeak.
I have deleted 50-odd lines of commentary about this; if you really disagree that we do frequently want different versions of a package in different versions of Squeak, I'll edit it and post it for your consideration. Assuming that we agree that we sometimes want to have different versions of the packages for different versions of Squeak, we must come up with some way to implement it. It seems to me that having separate maps results in a more pleasant system than what we have now, where people split their packages manually. And there are other benefits in addition to a convenient way to have multiple versions per package, as I have described in other posts.
I don't have anything against having another map, but I do like being able to go to ONE map for everything I need, no matter what version of Squeak I need it for.
Your goal and your strategy are inconsistent. Today, we have a single map, and yet you cannot simply go to SqueakMap and see what packages work with the version of Squeak you are currently running. There are version tags there, but the tags are incorrect. Keeping the tags up to date is a real nuisance! Imagine a Squeak user of today loading Squeak 3.4 and then opening the Package Loader. How many of the packages they see are going to actually work when they install them?
On the other hand, multiple maps make your goal easy. The equivalent of the version tags would be kept up to date in the normal course of operation. Consider. Before you post a new version of a package, you will do at least some minimal amount of testing in each image, such as loading the program and seeing that it starts okay. Once you are done testing in each image, you'll open the Package Poster and press the "POST IT" button. Which map do you think it will post it to...? It will post it to the map that corresponds to the image you just tested in, surely.
Now picture what other users are going to see when they open up the Package Browser in any particular image.... they are going to see a map containing the most recent version of each package that has been tested in the image version they are using. If you did your testing in 3.6 and they are in 3.4, then they will simply and automatically see the last version you *did* test in 3.4. It all works so simply and smoothly that I daresay people won't think much about it at all, and they'd wonder what all the fuss was about if they were to read these threads!
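The post-and-browse flow described above can be sketched roughly as follows. This is Python used as pseudocode; every name here (`maps`, `post_release`, `visible_release`) is invented for illustration and is not part of any real SqueakMap API:

```python
# Rough sketch of the one-map-per-Squeak-version flow described above.
maps = {"3.4": {}, "3.6": {}}  # map per Squeak version: package -> latest release

def post_release(image_version, package, release):
    # The Package Poster targets the map matching the image
    # the release was just tested in.
    maps[image_version][package] = release

def visible_release(image_version, package):
    # The Package Browser in a given image consults only that image's map.
    return maps[image_version].get(package)

post_release("3.4", "Chess", "1.0")  # tested and posted from a 3.4 image
post_release("3.6", "Chess", "2.0")  # a later release, tested only in 3.6
```

A user browsing in 3.4 would still see Chess 1.0, the last release actually tested there, while a 3.6 user sees 2.0, with no version tags to maintain by hand.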
I think it depends on the person. Some people almost always just want the latest known "working configuration," even if that means they do not have the latest and greatest of every particular required package. For example, if you have a dedicated computer that performs a menial chore (capturing weather data, say) and you don't need or care about having the latest, you just want the computer to do the work reliably.
In this case, it's useful to be able to do a one-click install and KNOW that the entire configuration will work, without having to debug or wonder whether some change in a newer version of one of the sub-packages is going to render it buggy.
In the extreme case, you won't update that computer at all, in which case all of this discussion is moot.
If you do occasionally update it, however, you will typically want to grab all of the bug fix updates that are out there, while avoiding any updates that add new features. Multiple maps solves this beautifully. Simply have a different updating policy for each map. A map for Squeak 3.6 would only allow updates that have been thoroughly tested and which do no more than fix bugs. End of story.
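A per-map update policy like the one just described could look roughly like this. The function names and the major.minor.patch tuple convention are my assumptions, purely for illustration; SqueakMap defines nothing of the sort today:

```python
# Illustrative only: a "stable" map accepts a candidate release only if it
# keeps the same major.minor as the current release, i.e. it is a bug fix.
def bug_fix_only(current, candidate):
    return candidate[:2] == current[:2] and candidate[2] > current[2]

# A "latest" map accepts any newer release, features and all.
def anything_newer(current, candidate):
    return candidate > current

assert bug_fix_only((1, 4, 1), (1, 4, 2))      # patch bump: accepted
assert not bug_fix_only((1, 4, 1), (1, 5, 0))  # new features: rejected
assert anything_newer((1, 4, 1), (1, 5, 0))    # the unstable map takes it
```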
With single maps, things are more complicated....
By the way, you used the word "know" rather quickly. Not only is a total absence of errors impossible to achieve, but people who want to try hard to achieve it are surely not going to want to rely only on random testers from out on the Internet. If you really want a super-stable Squeak image, then you are going to have to test it yourself, in a specific configuration. Even if someone out on the net tested that exact same set of package versions, you still would not want to use that configuration automatically. You would want to test it yourself.
Also by the way, the approach I described is exactly what Debian does, and it seems to work well. Debian is an excellent choice of Linux distribution for high-reliability situations, precisely because it does have separate streams of packages for "stable" and "unstable" systems.
-Lex
lex@cc.gatech.edu wrote:
Also by the way, the approach I described is exactly what Debian does, and it seems to work well. Debian is an excellent choice of Linux distribution for high-reliability situations, precisely because it does have separate streams of packages for "stable" and "unstable" systems.
Lex,
I'm running out the door so no time for a detailed reply, but why does it matter whether the "separate streams" are implemented as multiple physical repositories or as one shared repository that allows multiple versions of one package to exist, each flagged appropriately for which Squeak versions they work in?
As I skimmed your ideal user interactions, I saw nothing about them that couldn't be implemented identically with either solution. The only real difference, as I see it, is that with a single repository you can avoid duplicating the information that *is* applicable at a higher level (like to the package as a whole).
Julian
Guys, I never disagreed that it is possible to get along with a single map. However, multiple maps seem to simplify some things and to enable some other things, all with no additional code beyond supporting multiple maps. With multiple maps, you immediately gain the following abilities:
1. Keeping track of which package releases are in which Squeak versions, with a UI that requires no extra activities by the users.
2. The ability for users to set up their own package universes without needing to coordinate with any central authority.
3. The ability to create nested universes and to have packages in inner universes automatically propagate to universes which contain them.
4. The ability for a package to be maintained by different people in different package universes, e.g. if I only support Chuck in 3.7, Joe can volunteer to support it in 3.0.
I find this list striking, when one considers that all of these features happen automatically and with no coding at all.
Now, perhaps I am misunderstanding the purpose of SqueakMap. If SqueakMap wants to be a catalog of everything, then I have misunderstood and that is fine. In that case, however, I do suggest that we start work on a new tool which is a "package universe browser". Users should have some tool that lets them select packages from a menu, install them, and have it just work. They should be able to do this even if they aren't following the main development version, and they should be able to integrate their own packages into the system without needing cooperation from a central authority. If SqueakMap is not intended for these purposes, then we should put together something else.
Regarding versioned dependencies, Julian, Goran did indeed call the configurations "dependencies". And please consider that in the A,B,C example there was *never* a time that I could upgrade *any* package. If this is typical, then the configurations will always be ignored by the user, which makes them pointless. If we want a dependency system that the user will practically never want to override--and I think we can get there--then it seems that we should not put fixed version numbers in the dependencies.
Overall, I have followed communities where the packaging system causes its users headaches (RedHat, Windows DLLs), and I have seen one community where the packaging system Just Works (Debian). I have thought about this issue a good bit, and I am simply pointing out some conclusions that I expect will make our system work better. Squeak is a decentralized playground, though, and everyone is free to play in it however they like.
-Lex
Hi Lex,
There are two separate issues here.
1. one map vs. multiple maps
2. versioned vs. unversioned "dependencies/configurations"
---------------
Regarding the first:
- I agree with you on many of the benefits of having multiple maps and certainly don't see why we shouldn't be able to have them.
- I disagree, however, that having multiple maps makes it particularly easier to support packages for multiple Squeak versions.
- I also think it's important to realize that there are trade-offs. You list all the things you get without any coding by having multiple maps. But there are things you can no longer get without coding: having one package release that works in multiple images and needs to be released into all the maps at once, keeping maintainer data up to date in all the maps, or easily trying the version that you know works in 3.7 in 3.6 to see if it works there.
Summary: if we want multiple maps so people can run their own ones, etc., then fine--there's no reason we should hardcode the URLs or avoid a UI that lets you have more than one map. But I see no compelling reason to use a unique map for each Squeak version.
--------------------
Regarding the second:
- I never said Goran *never* called them dependencies. All I meant is that when he was explaining them to me he called them configurations. Why are we still arguing about this? We seem to all agree that storing the versions that work is a good idea and that forcing the user to only be able to load that version is a bad idea.
- I use Debian myself and I agree that it works great, *most of the time*. It doesn't *always* "just work", however.
- Usually when it doesn't work (in my experience) it is because a package hasn't been updated yet to work with changes made in a new version of one of its dependencies. The newest versions of packages do not always work together.
Finally, I still don't understand why you think that you can't upgrade A,B, or C. As I worked through the scenario I could upgrade anything I wanted. The only time you can't upgrade something is if two different packages depend on features that are incompatible between two different versions. And that's just a fact. In all other cases, the knowledge of:
a) which versions work together,
b) which versions don't work together, and
c) the level of compatibility from one version of a package to the next
will help the system build or guide the user through building a working collection of packages.
It seems far more accurate to me to record:
- A1 requires B
- A1 works with B2
- A1 does not work with B1
- B3 includes only minor, compatible changes from B2

than to record (as Debian does, I think):
- A1 requires B >= 2

or (as you suggest):
- A requires B
We can implement the behaviour of either of the other two lists from the first. The first contains more information. The information is all easily obtained. The first list is the way users who are testing or using the packages think about it anyway.
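The claim that the coarser schemes can be derived from the detailed facts is easy to sketch. The data layout and function names below are mine, purely for illustration of the idea:

```python
# Detailed facts in the style of the first list above.
facts = {
    "A1": {
        "requires": "B",
        "works_with": {"B2", "B3"},
        "fails_with": {"B1"},
    },
}

def debian_style(release):
    # Collapse the detailed record into a Debian-style ">=" constraint.
    f = facts[release]
    dep = f["requires"]
    lowest = min(int(v[len(dep):]) for v in f["works_with"])
    return "%s requires %s >= %d" % (release, dep, lowest)

def unversioned(release):
    # Collapse further into the bare "A requires B" form.
    pkg = release.rstrip("0123456789")
    return "%s requires %s" % (pkg, facts[release]["requires"])
```

Both `debian_style("A1")` and `unversioned("A1")` are computed from the same detailed facts; the reverse derivation is impossible, which is the point about not throwing information away.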
I feel like I've exhausted my arguments here so I may wind down my participation in the thread at this point... :)
Cheers,
Julian
lex@cc.gatech.edu wrote:
Guys, I never disagreed that it is possible to get along with a single map. However, multiple maps seem to simplify some things and to enable some other things, all with no additional code beyond supporting multiple maps. With multiple maps, you immediately gain the following abilities:
- Keeping track of which package releases are in which Squeak
versions, with a UI that requires no extra activities by the users.
- The ability for users to set up their own package universes without
needing to coordinate with any central authority.
- The ability to create nested universes and to have packages in inner
universes automatically propagate to universes which contain them.
- The ability for a package to be maintained by different people in
different package universes, e.g. if I only support Chuck in 3.7, Joe can volunteer to support it in 3.0.
I find this list striking, when one considers that all of these features happen automatically and with no coding at all.
Now, perhaps I am misunderstanding the purpose of SqueakMap. If SqueakMap wants to be a catalog of everything, then I have misunderstood and that is fine. In that case, however, I do suggest that we start work on a new tool which is a "package universe browser". Users should have some tool that lets them select packages from a menu, install them, and have it just work. They should be able to do this even if they aren't following the main development version, and they should be able to integrate their own packages into the system without needing cooperation from a central authority. If SqueakMap is not intended for these purposes, then we should put together something else.
Regarding versioned dependencies, Julian, Goran did indeed call the configurations "dependencies". And please consider that in the A,B,C example there was *never* a time that I could upgrade *any* package. If this is typical, then the configurations will always be ignored by the user, which makes them pointless. If we want a dependency system that the user will practically never want to override--and I think we can get there--then it seems that we should not put fixed version numbers in the dependencies.
Overall, I have followed communities where the packaging system causes its users headaches (RedHat, Windows DLLs), and I have seen one community where the packaging system Just Works (Debian). I have thought about this issue a good bit, and I am simply pointing out some conclusions that I expect will make our system work better. Squeak is a decentralized playground, though, and everyone is free to play in it however they like.
-Lex
Hi all!
lex@cc.gatech.edu wrote:
Guys, I never disagreed that it is possible to get along with a single map. However, multiple maps seem to simplify some things and to enable some other things, all with no additional code beyond supporting multiple maps.
As Julian noted you disregard all the problems we will have to solve with multiple maps.
With multiple maps, you immediately gain the following abilities:
- Keeping track of which package releases are in which Squeak
versions, with a UI that requires no extra activities by the users.
I can't understand what you mean. With multiple maps you will have to choose the "3.6" map in order to see the "3.6" packages, right? And with multiple maps I will need to select which maps to register a release in, depending on which Squeak versions it works in, right? Again, no difference compared to what we have now.
- The ability for users to set up their own package universes without
needing to coordinate with any central authority.
This is planned. The idea is to have a tree of map servers in which you add local additional information, thus you can have local packages only visible to you, or your company etc.
But this has NOTHING to do with which Squeak version things are meant for. It is purely a visibility thing - and the servers are still clearly connected with each other in a tree. But - note - this is HARD to get fully working properly and still make sure the model fits together properly. For example - the category tree should be the same for all servers etc.
In short:
1. The future SM will have a "single" map, but with multiple map servers structured in a tree, thus giving us local additions and scoped visibility.
2. This is hard stuff to get working properly, unless we are going to throw away stuff we have today.
3. This has nothing to do with Squeak version or any other partitioning of the packages or their releases. It will ONLY affect visibility.
- The ability to create nested universes and to have packages in inner
universes automatically propagate to universes which contain them.
This is almost what I plan - but it sounds backwards. :) To be clear, this is my plan with a few imaginary servers thrown in:
Master server - this is the current map1.squeakfoundation.org.
Squeak client - these are the leaves of the tree. They can sit directly under the master, like they do today.
Impara server - let's say Impara sets up a server of their own.
Michael's Squeak - Michael instead connects to the Impara server, which knows it is a sub-server of the master.
Now, all the things in SM are sub instances of SMObject. The idea is that an SMObject has a home map, it knows where in this tree it "belongs" and it is visible to all servers below it. Thus, a package that has the Master server as home map (like today) will be visible to all Squeakers in the world. An SMPackage or even SMAccount (!) with home map being Impara will only be visible below it - within Impara.
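That visibility rule can be sketched with a simple parent pointer per server. The names here are invented for illustration; the real model uses SMObject, SMPackage and friends:

```python
# Each server knows its parent in the tree; the master has none.
parent = {"Master": None, "Impara": "Master", "Michael": "Impara"}

def visible_at(home_map, server):
    # An object is visible at a server if its home map is that server
    # or any ancestor of it -- i.e. the object "belongs" at or above us.
    while server is not None:
        if server == home_map:
            return True
        server = parent[server]
    return False

assert visible_at("Master", "Michael")     # master packages: visible everywhere
assert visible_at("Impara", "Michael")     # Impara-local: visible below Impara
assert not visible_at("Impara", "Master")  # but not above it
```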
- The ability for a package to be maintained by different people in
different package universes, e.g. if I only support Chuck in 3.7, Joe can volunteer to support it in 3.0.
This already works today with co-maintainers.
I find this list striking, when one considers that all of these features happen automatically and with no coding at all.
That is not true. You are trying to make it sound as if having "multiple maps" (whatever that really means -- have you looked at what SMSqueakMap contains? It contains SMCategories and SMAccounts, and I would like to know how those will be handled in "multiple maps") is a no-brainer and requires "no coding". This is just false.
Again, have you even looked at what SMSqueakMap contains today? Are you aware of SMAccounts and SMCategory? Are you aware that SMPackage knows its releases? And that the releases know which of the co-maintainers published it? (=a reference from SMPackageRelease to an instance of SMAccount) etc etc.
What I am saying is that the domain model is quite a bit more complex than just "a bunch of package descriptions", and thus you CAN'T easily just split it up onto a bunch of servers.
Now, perhaps I am misunderstanding the purpose of SqueakMap. If SqueakMap wants to be a catalog of everything, then I have misunderstood and that is fine. In that case, however, I do suggest that we start
Have you read the article I recently posted on SqP? It explains quite a bit about what I want SM to be. And yes, it was just recently posted so you may have not seen it yet.
work on a new tool which is a "package universe browser". Users should have some tool that lets them select packages from a menu, install them, and have it just work.
Hmmm, and what is the package loader you think?
They should be able to do this even if they aren't following the main development version, and they should be able to integrate their own packages into the system without needing cooperation from a central authority. If SqueakMap is not intended for these purposes, then we should put together something else.
I am not sure what you mean by "not following the main development version". The part about integrating their own packages, though, is definitely planned in SM - I explained it above. The reason for not moving in this direction until now is simply that it will be quite complex to get it all working smoothly. It was simply much easier to have a SINGLE master model on a server, especially when it has been evolving over time as it has.
SqueakMap IS intended for these purposes.
Regarding versioned dependencies, Julian, Goran did indeed call the configurations "dependencies". And please consider that in the A,B,C
I don't know when I did - it sounds "wrong" but it may have been a slip of the tongue at some point. Can you point me to it?
example there was *never* a time that I could upgrade *any* package. If
This sounds very wrong. Julian responded on this topic and it is probably pointless for me to reiterate what he wrote.
this is typical, then the configurations will always be ignored by the user, which makes them pointless. If we want a dependency system that the user will practically never want to override--and I think we can get there--then it seems that we should not put fixed version numbers in the dependencies.
Lex, I can only say this:
The configuration model I have in mind is more about having users and maintainers "recording" working configurations so that other people, using an engine, can install packages and needed dependencies as easily as possible. Given this approach, two things are apparent:
- Record the exact information available. There is simply no POINT in throwing away info.
- It must be very easy to both:
  - record a working config, and
  - deviate from the information so far collected.
I have tried to explain these ideas many times. I think it will work. It does NOT force the user in any way - in fact, it will be MUCH more friendly than, say, Debian, because the engine will be able to give advice based on the release compatibility levels. And thus people will not be afraid to try newer releases, and they will discover and record whether they work or not.
Now, enough typing. It doesn't feel productive. I get the feeling you are not listening to what I have to say and you are not even contemplating that it can be solved in a better and friendlier way than in Debian. Debian is nice, but it ain't perfect.
As I have said before - we use Debian on all our servers and it was the main inspiration for SqueakMap. But it is not the perfect solution - especially not for Squeak which is different in many ways.
Overall, I have followed communites where the packaging system causes its users headaches (RedHat, Windows DLL's), and I have seen one community where the packaging system Just Works (Debian). I have thought about this issue a good bit, and I am simply pointing out some conclusions that I expect will make our system work better. Squeak is a
Yes, you are pointing things out - but it doesn't seem that you have looked at how SqueakMap works.
decentralized playground, though, and everyone is free to play in it however they like.
Of course. But SM is such a central piece of the community that I want as many as possible to like the way I am pushing it. I can't of course convince everyone - but just so you know - this is the main reason I am discussing this so much on this list.
-Lex
regards, Göran
Hey Goran, I truly have been reading what you and others post. For example, I showed that Julian's "problems" are actually easier to solve with multiple maps, and no one responded. Also, I truly am familiar with the details of SqueakMap that you are describing, though I have not delved into the details of its implementation. Please do take what I say seriously; I am not just covering my ears and harping on my favorite tune.
There are two main desiderata I mentioned that you say you are not planning to implement: simple unversioned dependencies, and allowing clients to pull from multiple maps.
On dependencies, I would love if you were to actually step through either the example I posted or one of your own. Any example will do so long as it is not in lockstep and so long as you do not require interaction with the user to make the decisions. What do you envision the package loader will do at each step?
http://lists.squeakfoundation.org/pipermail/squeak-dev/2004-July/080076.html
In the multiple maps question, I can boil the issues down to two scenarios.
Scenario 1: Someone wants to provide a mixin server that can be used to provide packages to multiple package universes. How do you support this, while allowing end users to use several different mixin servers in the same image?
Scenario 2: Your local organization makes its own SqueakMap server, with potentially its own user accounts, packages, and policies about package updates. The local server wants to be an extension of Squeak 3.7final. Can the organization do this without *any* interaction with a central Squeak authority?
Both 1 and 2 are trivial if the client pulls from a list of maps instead of just one map.
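For concreteness, here is roughly what "pulls from a list of maps" could mean on the client side. The merge-by-order rule and all the names below are my assumptions for the sake of the sketch, not something Lex specified:

```python
# The image is configured with an ordered list of maps; a lookup takes the
# first map that offers the package, so local and mixin maps can extend
# (or shadow) the master without any central coordination.
local_map  = {"CompanyTool": "1.2"}                    # scenario 2: local server
mixin_map  = {"DVDSupport": "0.9"}                     # scenario 1: mixin server
master_map = {"Chess": "2.0", "DVDSupport": "0.8"}     # the central map

client_maps = [local_map, mixin_map, master_map]       # searched in order

def lookup(package):
    for m in client_maps:
        if package in m:
            return m[package]
    return None
```

With this shape, scenario 1 is just adding a mixin map to the list, and scenario 2 is just putting a local map ahead of the master -- neither requires talking to a central authority.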
Scenario 2b: What if HP and Compaq each have a local SqueakMap server, and then HP acquires Compaq and they want to merge the server contents together? There is some inevitable pain from this, but how can we minimize it? Multiple maps help on 2b, though there are more issues to consider as well.
These said, here are some individual comments:
With multiple maps, you immediately gain the following abilities:
- Keeping track of which package releases are in which Squeak
versions, with a UI that requires no extra activities by the users.
I can't understand what you mean. With multiple maps you will have to choose the "3.6" map in order to see the "3.6" packages, right? And with multiple maps I will need to select which maps to register a release in, depending on which Squeak versions it works in, right? Again, no difference compared to what we have now.
You have the general idea, though you are missing one important difference. With multiple maps, each image can *implicitly* specify the universe it is operating in via the list of maps it is pulling from. Then the tools need no further interaction with the user. When downloading packages, when uploading packages, and when uploading thumbs-up notifications, the tools will Just Know what to do.
Anyway, even if you disagree that the multiple maps approach is a better result, the really big point is that multiple maps gives it to you with no further code.
- The ability for users to set up their own package universes without
needing to coordinate with any central authority.
This is planned. The idea is to have a tree of map servers in which you add local additional information, thus you can have local packages only visible to you, or your company etc.
But this has NOTHING to do with which Squeak version things are meant for. It is purely a visibility thing - and the servers are still clearly connected with each other in a tree.
I don't understand your "nothing" comment, because visibility and versions-meant-for (I'd prefer universes-meant-for) are the same thing. But I'll proceed on the parts I understand.
Trees of servers are very close to what I am talking about, but they do not give you the full functionality that multiple maps do. In particular, they do not support "mixin" servers such as the ones in Debian for DVD software and updated Java plugins, etc. There are hundreds of these mixins nowadays, so it is a valuable feature to support. To support mixin servers, though, you must allow the nodes of the trees to have multiple parent nodes. And once you do that, you are doing exactly what I suggest, only at the server level instead of in the clients.
- This is hard stuff to get working properly, unless we are going to
throw away stuff we have today.
I agree that difficulty of implementation is important. However, it needs to be compared with the difficulty of implementing the alternatives. Also, isn't it bothersome if a union operation is difficult to implement?
- The ability for a package to be maintained by different people in
different package universes, e.g. if I only support Chuck in 3.7, Joe can volunteer to support it in 3.0.
This already works today with co-maintainers.
It mostly works, though it currently requires central coordination. What if it's Jose wanting to support the Spanish version of Chuck? Jose should not need permission from me, and he should be allowed to call the package "Chuck" on the Spanish SqueakMap. More on this below.
I find this list striking, when one considers that all of these features happen automatically and with no coding at all.
That is not true.
I meant, no coding beyond the coding necessary to get multiple maps. Do you disagree that multiple maps would give you the things I describe, with no further coding?
Now, perhaps I am misunderstanding the purpose of SqueakMap. If SqueakMap wants to be a catalog of everything, then I have misunderstood and that is fine. In that case, however, I do suggest that we start
Have you read the article I recently posted on SqP? It explains quite a bit about what I want SM to be. And yes, it was just recently posted so you may have not seen it yet.
No, I haven't. Thanks for the pointer; I look forward to reading it.
work on a new tool which is a "package universe browser". Users should have some tool that lets them select packages from a menu, install them, and have it just work.
Hmmm, and what is the package loader you think?
It could be either one. Do you see the difference I mean between a "catalogue of everything" and the toolset that would support package universes? SqueakMap clearly started life as the former, and it sounds like it is now trying to be both the former and the latter. It may simplify things to divide SqueakMap into two parts, one for each of these purposes.
Regarding versioned dependencies, Julian, Goran did indeed call the configurations "dependencies". And please consider that in the A,B,C
I don't know when I did - it sounds "wrong" but I may have slipped my tongue sometime. Can you point me to it?
I am glad that you are not calling these dependencies any longer. However, when I made my post earlier, you did halfway object to my request to have unversioned dependencies:
http://lists.squeakfoundation.org/pipermail/squeak-dev/2004-July/080026.html
Also, here is a direct quote from a while back where you call it a "dependency conflict" that the configurations have not been solved happily:
"Hey, Mungo 1.3 has a dependency conflict with Pingu 1.2. They both rely on Zingo but Pingu has only been verified to work with Zingo 2.0. According to the compatibility level of Zingo 2.0 it is categorized as *compatible* (=level 4, changes have been made but the maintainer says he hasn't changed the API so it should work), would you like to use that instead?"
If they aren't dependencies then why do they give a "dependency conflict"?
The configuration model I have in mind is more about having users and maintainers "recording" working configurations so that other people, using an engine, can install packages and needed dependencies as easily as possible.
Here you are calling them dependencies again....
It must be very easy to both:
Record a working config
Deviate from the information so far collected.
Sure. Please remove the "must", however. And by the way, it is interesting that you consider the second so important, when Debian is doing quite well without it.
example there was *never* a time that I could upgrade *any* package. If
This sounds very wrong. Julian responded on this topic and it is probably purely useless for me to reiterate what he wrote.
What would be useful would be for someone to make a concrete proposal of what should happen in examples like the one I described. No one has done that. They simply say the configurations should be ignored. Doesn't it defeat the purpose, though, if all this great information is simply ignored by the tools?
I have tried to explain these ideas many times. I think it will work.
And I hope you succeed. This is a wonderful rallying center of our community.
Debian is nice, but it ain't perfect.
It is also proven to work, as I'm sure you agree.
decentralized playground, though, and everyone is free to play in it however they like.
Of course. But SM is such a central piece to the community, that I want as many as possible to like the way I am pushing it. I can't of course convince everyone - but just so you know - this is the main reason I am discussing this so much on this list.
One last thing. After reading this post and skimming your Squeak People article, I see that you are trying to centralize various aspects of SqueakMap. This seems like a good idea, but let us please be wary of building centralist policies into the architecture itself. Squeak is a loose community to begin with, and it seems overly ambitious to try to force people to organize more rigorously when they haven't decided to already.
Let me give some specific examples. Here are some things that should be doable at a local level, without involvement from any central authority:
1. The creation of new universes that draw packages from an existing one. The local guys should not have to coordinate with the server you are drawing from.
2. The creation and maintenance of user accounts. Individual organizations should be able to have their own accounts without publishing them publicly.
3. The designation of who has permission to post package versions to each server. Particular package universes should have their own rules about this, that even the original owner of a package cannot necessarily override. (Even though in the main Squeak repository, we want to give more deference to the designated package owner.)
You may not be surprised at this point, but multiple maps gives you these decentralized policies automatically, and also, Debian manages its policies in a way that gives the above properties. While these properties are not strictly necessary, they are both desirable and attainable.
-lex
Lex,
lex@cc.gatech.edu wrote:
...
On dependencies, I would love if you were to actually step through either the example I posted or one of your own. Any example will do so long as it is not in lockstep and so long as you do not require interaction with the user to make the decisions. What do you envision the package loader will do at each step?
http://lists.squeakfoundation.org/pipermail/squeak-dev/2004-July/080076.html
I've taken the liberty of using your example as a use case; see the mail with subject "[DEPS][PAPER] Dependencies for Squeak", section Examples->To Lex's example.
Greetings Stephan
...
Hi Lex!
WARNING: This is a humongous posting. If you are not really into SM and dependencies, the question on SM architecture etc - it is probably DEAD BORING.
(ok, taking time to answer as much as I can, coding will just have to wait)
lex@cc.gatech.edu wrote:
Hey Goran, I truly have been reading what you and others post. For
Ok, then. It seemed to me that you hadn't.
example, I showed that Julian's "problems" are actually easier to solve with multiple maps, and no one responded. Also, I truly am familiar
Can you point me to that particular post? Or perhaps it was the one I responded to in this posting (the walk through).
with the details of SqueakMap that you are describing, though I have not delved into the details of its implementation. Please do take what I say seriously; I am not just covering my ears and harping my favorite tune.
I *am* trying to take it seriously.
There are two main desiderata I mentioned that you say you are not planning to implement: simple unversioned dependencies, and allowing clients to pull from multiple maps.
"to implement simple unversioned dependencies". Well, that depends on what you mean - I don't intend to *record* such dependencies *but* they are of course *implicit* since I store more information. (note: a little bit more on this below, where I talk about "intended" dependencies)
"Allowing clients to pull from multiple maps". It is not clear to me what that exactly means.
I have explained that I will move towards a tree (up side down, well that is my inner picture of it) structure of servers where the master is the root and where you can connect to a chosen server down the tree thus getting access to SMObjects not visible higher up (closer to the root).
Also, note that the map is a catalog - it is not a remote directory full of package files.
IMHO the mess that Debian suffers from, because of the simple fact that they don't have a central catalog, is not something we want. If information is meant for all of us - then it should easily be available to all of us - and I don't want to worry about whether there is a package somewhere on some server that I don't know about.
On dependencies, I would love if you were to actually step through either the example I posted or one of your own. Any example will do so long as it is not in lockstep and so long as you do not require interaction with the user to make the decisions. What do you envision the package loader will do at each step?
http://lists.squeakfoundation.org/pipermail/squeak-dev/2004-July/080076.html
Sure. In fact I am replying to it all "in place" here: --------------------------
Individual authors are not in a position to make guarantees about releases, regardless of the technology. Arbitrary combinations of packages can interfere with each other. Additionally, there are no guarantees in software engineering to begin with. Tomorrow the sun will rise again and someone will find another bug in my code.
Eh? Guarantees? I wrote "promise" and what I of course mean is that the maintainer tells me that, sure "my package X0.09 works if you have Y1.2 and Z1.3" installed, because that is what he has tested.
Of course it isn't a "guarantee" - but it sure is much more precise than saying "my package X works if you have Y and Z".
So I have no idea where you are going with arguing about "guarantees" - we both know what I am saying.
If you really think there are guarantees to be made, though, let me give you two examples to consider:
- How do dependencies keep someone from changing the behavior of #new
to automatically call initialize and thus break your code? Does every package have a dependency on every part of the system kernel that it uses?
Of course not - don't be silly.
- How do you make a guarantee of correctness, when some other package
might modify the "open" menu in a way that crashes the system whenever the "open" menu is used?
And if you *still* like "guarantees", how do you fit post-release bug reports into the world view? I guaranteed it yesterday, but now a new bug has been found; do I un-guarantee the package? What is the point of a guarantee that can be un-guaranteed later? Maybe I should just fix the bug and post an updated package and stop wrangling with dependencies. :)
You are rambling and if you keep arguing in this fashion then I will not bother discussing this. I haven't said "guarantee" - and even if I had used that word it was bloody obvious what I meant.
I as a maintainer can only say that "Yes, my package release 0.09 works as far as I - as the developer - can say, given that you have package releases Y1.2 and Z1.3 installed that it depends on. If you have older releases - I don't know, because I haven't tested those yet. If you have newer releases, I don't know - I especially don't know about *future* releases because I have no idea what will change in Z or Y tomorrow".
Btw, you did stumble upon a slightly interesting fact though - you said "how do you fit post-release bug reports into the world view?". Given that in the planned SM the dependency information is not embedded in the actual release (as it is in Debian), this means that the dependency information can be revised post-release. And it should be, especially by adding more known working configurations. If the maintainer or anyone else discovers that, darn - there was actually a serious bug in Z1.2 and Z1.3 is really needed - then the configuration can and SHOULD be changed.
And if someone discovers that, hey "I used Z2.0 and it seems to work fine with that too!" then that should/could be added too. That is one of the key points of NOT embedding dependency information inside the releases.
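To make this concrete, here is a minimal sketch (in Python, purely illustrative - the class and method names are hypothetical, not the real SM API) of the point that configurations live in the catalog rather than inside the release artifact, so anyone can add a new known-working configuration after the release has shipped:

```python
# Hypothetical sketch: configurations are catalog-side records keyed by
# release, so they can be added or revised without re-releasing anything.

class Catalog:
    def __init__(self):
        # release id -> list of recorded configurations, each being the
        # set of dependency releases it was tested with, plus who says so
        self.configs = {}

    def add_config(self, release, tested_with, reporter):
        self.configs.setdefault(release, []).append(
            {"tested_with": frozenset(tested_with), "reporter": reporter})

    def known_working(self, release):
        return [c["tested_with"] for c in self.configs.get(release, [])]

catalog = Catalog()
# The maintainer records the configuration at release time...
catalog.add_config("X0.09", ["Y1.2", "Z1.3"], reporter="maintainer")
# ...and someone else adds another one later, post-release.
catalog.add_config("X0.09", ["Y1.2", "Z2.0"], reporter="dan")
print(len(catalog.known_working("X0.09")))  # -> 2
```

The key design choice being illustrated: because the records are separate objects in the catalog, fixing a wrong configuration or adding a newly verified one never touches the release itself.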
Aside from the issues with the "guarantees" idea, it has not been established that the jigsaw puzzle of version-specific dependencies is going to be practical for tools to work with. This is a significant technical problem that versioned dependencies add, and everyone seems to be waving it off as something to handle in the future. I don't know that this is solvable, guys, and no one has made a convincing argument otherwise.
Well, I haven't built this yet - so how can I be sure. But I have thought about it and I have discussed it with many people. Now I want to give it a shot and we can see. If I fail - so what?
Then you can gloat and laugh and make a fool out of me as much as you like and then take over maintaining SM. ;)
In addition to those two problems, there are definitely times when the
Noting that the first "problem" was... well, what was it? Something about me using the word promise. And the second was that... we don't know if it will work until we have written it? Sure. Like all software.
dependency jigsaw cannot be adequately solved. Let me run you through a small example. Packages A and B depend on Collections:
A 1.0 needs Collections 1.0
B 1.0 needs Collections 1.0
Fine so far, I install all three packages. Now the collections library gets updated to 1.1. I cannot install the upgrade without breaking the dependencies of A and B!! So, I wait before installing it, even though
Not entirely true. First of all - the engine should be parameterizable (cool word) by you.
So the idea is that if you - at this point - select Collections 1.1 and press "install" it could say something like "The installed package releases A1.0 and B1.0 are only known to work with Collections1.0, but since Collections1.1 is marked by the maintainer as being 'Code changes, but only bug fixes' they could still likely work, do you still want to proceed and install it?"
And if you set a parameter, let me fantasize here - "Allowed compatibility threshold" or something - then perhaps it wouldn't even ask if the new release was below a certain compatibility level.
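A minimal sketch of that fantasized "allowed compatibility threshold" parameter (the level names and numbers here are my own illustration, not a fixed SM scheme): compatibility levels are ordered, and the engine only bothers the user when a candidate release falls below the threshold they configured.

```python
# Illustrative ordering of compatibility levels; higher means safer.
LEVELS = {
    "no code changes": 5,
    "compatible, API unchanged": 4,
    "code changes, only bug fixes": 3,
    "code changes, may break compatibility": 2,
    "incompatible": 1,
}

def needs_confirmation(release_level, threshold):
    """Prompt the user only when the release is below their threshold."""
    return LEVELS[release_level] < threshold

# With a threshold of 3, a bug-fix release installs without a dialog...
print(needs_confirmation("code changes, only bug fixes", 3))            # -> False
# ...while a possibly-breaking one triggers the question.
print(needs_confirmation("code changes, may break compatibility", 3))   # -> True
```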
Now, it doesn't end there. Not only can you install Collections1.1 (and, as I described, in a much more informed way than if the recorded dependencies were just "A and B need Collections") but if someone ELSE has already tried this, and discovered that A and B work just FINE (who knows, perhaps there is even a test suite to run!) then he/she can have beaten the maintainers of A and B respectively and attached new configurations to A and B saying that, according to him - A1.0 works just fine with Collections1.1 (and the same for B).
So if you then trust this person, you can go ahead with even more information to guide you - he said it worked, and since it was Dan Ingalls - you decide to trust him.
And my third and final point - if there was no Ingalls-config available, and you did have to "take a chance" - then YOU can be the good citizen paving the way for others to follow by verifying that it works and attaching a new configuration to A and B.
Now - IMHO this is a GREAT model :). I am not saying we will not discover details to tweak with it - for example, I can surely see it extended with information about "intended dependencies" - meaning that the maintainer might want to tell people what packages he wants to depend on, this may not always be the same thing.
I might be the main developer of package C, which also uses Collections. Okay, so eventually the author of A, being a great citizen, upgrades their Collections package and reruns their tests -- shock, nothing broke. While they are at it, they add a few class comments, and then post A 1.1 which depends on Collections 1.1. Drats, I still cannot update my Collections library, because that will make me uninstall B.
No big difference, at this point - select Collections 1.1 and press "install" and it could say something (quite similar) like "The installed package release B1.0 is only known to work with Collections1.0. A1.0 on the other hand can be upgraded to A1.1, which is known to work with Collections1.1. Since Collections1.1 is marked by the maintainer as being 'Code changes, but only bug fixes' B1.0 could still likely work, do you still want to proceed and install Collections1.1 and also A1.1?"
Of course, exactly how the engine will talk to the user is a UI issue. :) And upgrading A1.1 would of course be optional - the engine should only be HELPFUL - not forceful. You should always be TOTALLY FREE to install what the heck you like.
Note that I still cannot upgrade my Collections library, unless I uninstall B.
I can't really see where you have gotten this... "stiff bureaucratic" view of the planned dependency mechanism? :) Sure, you can upgrade Collections at this point.
Finally, suppose Collections gets updated to 1.2, and finally B gets around to doing an upgrade. B 1.1 depends on Collections 1.2. Now what? I *still* cannot upgrade B, because it depends on Collections 1.2, which is inconsistent with all available versions of A.
Ok, let's see where we are now:
Collections1.0  Collections1.1  Collections1.2
A1.0 -> Collections1.0
A1.1 -> Collections1.1
B1.0 -> Collections1.0
B1.1 -> Collections1.2
And you have Collections1.0, A1.0, B1.0 installed (the starting point ignoring the potential upgrades I have described above). Now, we have a few different ways to proceed. I will not describe all possible variations - but let's try going to Collections1.2.
At this point - select Collections 1.2 and press "install" and it could say something like "The installed package releases A1.0 and B1.0 are not known to work with Collections1.2. B1.0 can be upgraded to B1.1, which is known to work with Collections1.2. A1.0 can be upgraded to A1.1, which is known to work with the previous release of Collections - Collections1.1. Collections1.2 is marked by the maintainer as being 'Code changes, may break compatibility', so A1.1 could still work; do you still want to proceed and install Collections1.2 and then also A1.1 and B1.1?"
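The classification the engine performs in that message can be sketched very simply (the data is from the example above; the code itself is a hypothetical illustration, not SM code): for each installed package, find the newest release known to work with the candidate, if any.

```python
# Known-working configurations from the A/B/Collections example:
# release -> set of Collections releases it has been verified against.
known_ok = {
    "A1.0": {"Collections1.0"},
    "A1.1": {"Collections1.1"},
    "B1.0": {"Collections1.0"},
    "B1.1": {"Collections1.2"},
}
releases = {"A": ["A1.0", "A1.1"], "B": ["B1.0", "B1.1"]}  # oldest first

def classify(package, candidate):
    """Return the newest release of `package` known to work with
    `candidate`, or None if no release has been verified against it."""
    ok = [r for r in releases[package] if candidate in known_ok[r]]
    return ok[-1] if ok else None

print(classify("B", "Collections1.2"))  # -> B1.1 (a verified upgrade exists)
print(classify("A", "Collections1.2"))  # -> None (untested; fall back to
                                        #    the compatibility level)
```

When `classify` returns None, the engine falls back on the compatibility level of the candidate release to word its warning - which is exactly the difference between "not known to work" and "known not to work".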
I end up using A 1.0, B 1.0, and Collections 1.0. And I still cannot update the version of Collections that my C package works with, because I can't install the Collections update anyway. If I'm lucky, package A will release a version that works with 1.2 and I can upgrade everything.
As I have clearly shown this is not the case. I have NEVER argued for the dependency engine to be some kind of POLICE. You can still install *whatever you like*. The whole point is that you can do this and be AWARE of what it means and what the risks are.
And I have also, I hope, clearly shown that since we can all attach tested working configurations you will not be stuck at the whim of the maintainers. And I also hope I have shown that the configurations can be added AFTER the release, and even MODIFIED after the release. This is a GOOD THING - because if they are wrong they should simply be fixed.
But notice: in that case, every single package in the example has had to *synchronize* their releases in order to reach a state where the jigsaw has a solution. I could just as well be unlucky, and A might declare itself compatible with Collections 1.3.....
Not commenting here since I hope I have shown that the "jigsaw" doesn't NEED to be solved in full. If you have a solved jigsaw - then fine. If you don't then fine too, at least you are aware of the different dependencies in your image that haven't been tested by others.
It appears that all the packages need to be upgraded in lockstep. I can't offer you a proof for the more general case :), but I imagine it will only get *worse* as more packages enter the picture. Is it
I don't think so. Of course, it all gets "worse" in some kind of way with lots of intertwined packages - but I don't think the model will suffer. There are some really nice things here helping it:
1. Users can help keep the dependencies tracked and verified, not only maintainers. This is cool.
2. Pressure can build on key packages that really need to be upgraded in order to work properly.
This leads me to something I haven't talked about before - it should be possible to record an "anti configuration" - meaning that "No, sorry, I tested A1.1 with Collections1.2 and nope, didn't work". This would be invaluable to have. Again, we record information - that is all. Then how the information is used is purely up to you. I can even imagine different engines or at least clearly different pluggable strategies. :)
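The value of the anti-configuration is that it splits "we don't know" from "we know it fails". A tiny illustrative sketch (names are mine, not SM's) of an engine consulting both kinds of record:

```python
# Positive records: (release, dependency) pairs verified to work.
works = {("A1.1", "Collections1.1")}
# Anti-configurations: pairs that were tested and found NOT to work.
fails = {("A1.1", "Collections1.2")}

def verdict(release, dependency):
    """Three distinct answers, instead of a binary allowed/forbidden."""
    if (release, dependency) in works:
        return "known to work"
    if (release, dependency) in fails:
        return "known to fail"
    return "untested"

print(verdict("A1.1", "Collections1.1"))  # -> known to work
print(verdict("A1.1", "Collections1.2"))  # -> known to fail
print(verdict("A1.1", "Collections1.3"))  # -> untested
```

An engine strategy could then warn gently on "untested" but strongly on "known to fail", while still letting the user proceed in either case.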
acceptable that every package needs to be tested with each incremental version of every other package, and thus achieve the lockstep progression? That seems very inefficient if most package updates are compatibility-preserving bug fixes.
I have described that the compatibility level categorization of releases will make this information explicit. If you don't want to test your releases at all - then fine, others will do it for you. :) And if you do test them, then why not record which releases you tested it against?
Finally, notice that users almost always want the new versions of all packages, regardless of what the upstream author has actually tested with. Consider two cases:
I don't agree - but let me see what you are saying.
- They are drawing from a stable stream, e.g. 3.4. All package
updates are clear bugfixes, and further, all the testing that has happened has happened with the full set of packages loaded.
Eh... now you are talking about the update stream. That is only in part related I think.
- They are drawing from an unstable stream, e.g. 3.3. Nothing works
anyway, even if it has been "tested" by upstream, so you may as well
grab the newest stuff and hope for the best. Back off on occasion if something breaks.
I don't think these are the only "cases" at all. For example, I may be using 3.6 Basic with stable releases of KomHttpServer and HttpView (hah! yeah, right) for my deployed app - but also using the latest and hottest Monticello/Shout and Whisker because those are tools - and I want the latest and greatest of those.
In short - I think one should be able to mix and still know as much as possible about the risks and what has been tested or not.
Not sure what the cases were supposed to show either, that people always want the latest? Not convinced at all.
In short: dependencies on specific versions don't appear to help with their stated goal of reliability, they complicate the tools to an unclear degree, they appear to force development to be lockstep across all packages, and they actually *remove* the desirable functionality of upgrading all packages to their newest version.
Hehe, well - I sure hope that I have shown to at least some of the other readers of this thread that the above description of my plan is simply FUD.
To me, IMHO, they sure *do* appear to help, don't at all need to complicate the tools (that is of course a tool issue), don't at ALL force development to be lockstep, and don't remove anything at all.
I don't think this is a good bargain. There are already proven ways to get a reliable set of packages together, so let's use one of those instead.
IMHO I would say that SM has already proven itself in many, many ways. It works pretty fine, even though it has limitations.
I want to give this a shot - and I invite everyone to help me in doing this. And if we pull it off we will IMHO have a GREAT system that blows pretty much everything else I have seen out of the water.
And it will also be very suitable for test driven development. Did you hear that Stephane? :) Because if we can attach or in some other way associate how to run the tests with the packages (included or not) then they will form a formal basis of testing out new configurations of releases.
Hell, it could even be automated in theory - we could have a robot server sitting trying to test new combinations of package releases and attaching those autotested configurations automatically.
Why would I want to throw that information away?
Sure, keeping the information is fine, and there are probably exciting things you can do with it. Just don't use it as dependencies.
Well, I intend to use it as the recorded information about what releases a specific release needs in order to function properly. :)
-------------------------- continuing this reply:
In the multiple maps question, I can boil the issues down to two scenarios.
Scenario 1: Someone wants to provide a mixin server that can be used to provide packages to multiple package universes. How do you support this, while allowing end users to use several different mixin servers in the same image?
"provide packages to multiple package universes"? I am sorry, but I don't understand exactly what that means.
Besides, the idea is for SM to be THE catalog for Squeak. It is NOT meant to just be a copy of how things work in Debian. SM is a catalog (sure, it has upload facility, but that is just an "addon") and NOT a package repository.
In SM you know exactly how many packages there are. And the map is coherent. In Debian you have a gazillion unofficial repositories of package files; you need to find them, list them in your sources.list, and then you still don't really trust the packages to work properly together - because they are after all picked from here and there. And these files don't share things - as the objects in SM do (SMAccounts, SMCategories etc). I have asked you repeatedly how you would handle that.
I will support the tree structure I talked about - but it will still be a coherent map, though with different visibility levels - but I am not sure if that satisfies your need here.
And I have no idea why "while allowing end users to use several different mixin servers" is such an important thing here. Yes, I know you like Debian and sources.list - but what is so cool about it really?
From a user standpoint it sucks. Why should I need to find and list
several sources when instead there could just be a full complete catalog that I can look in?
Decentralization is good if it fulfills a goal. In Squeak I would say the centralization of SM is a GOOD THING. We have a SINGLE map. And the multiple-servers-in-a-tree will fulfill important goals - I don't intend to add that just to make things more complex. But it will still maintain the sense that there is ONE map, although with the ability to have privately local added stuff.
Scenario 2: Your local organization makes its own SqueakMap server, with potentially its own user accounts, packages, and policies about package updates. The local server wants to be an extension of Squeak 3.7final. Can the organization do this without *any* interaction with a central Squeak authority?
When you say "makes its own" - do you mean set up a node in the tree I was talking about, or do you mean "implement their own code"? Just curious because I am not sure what you mean with "policies about package updates".
And again - the servers in the tree are NOT partitions based on Squeak version.
But if you give me some slack in interpretation here - yes, a company will be able to set up a SM server on its own and hook it into the tree by telling it which parent server it has (typically the master). Then that server can have SMObject instances which are only visible to servers below it (that includes the clients).
So the answer is yes, but do note that these accounts and packages will NOT be visible to others; it will be purely for the local organization. So if the company wants things to be visible to the world - then why not register those objects in the master catalog? It is a *catalog* - not a repository.
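The visibility rule being described - an object registered at a server node is visible to that node and everything below it - can be sketched like so (the `Node` class and object names are hypothetical illustrations, not SM code):

```python
# Sketch of the server tree: visibility flows downward from the master.

class Node:
    def __init__(self, name, parent=None):
        self.name, self.parent, self.objects = name, parent, set()

    def visible(self):
        # Everything registered at this node, plus everything inherited
        # from all ancestors up to the root (the master).
        above = self.parent.visible() if self.parent else set()
        return self.objects | above

master = Node("master")
acme = Node("acme", parent=master)       # a company's local SM server
master.objects.add("Chuck")              # public package on the master
acme.objects.add("AcmeInternal")         # private to the company

print(sorted(acme.visible()))    # -> ['AcmeInternal', 'Chuck']
print(sorted(master.visible()))  # -> ['Chuck']  (local stuff stays local)
```

Which matches the description: the company sees the whole public map plus its own additions, while the rest of the world never learns that AcmeInternal exists.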
Both 1 and 2 are trivial if the client pulls from a list of maps instead of just one map.
No it is NOT trivial. Either we give up a bunch of stuff we have today - and make SM just be a copy-of-Debian, and sure - *then* it may be easy (trivial is a strong word). But if we want to keep a lot of the things we have today then it is DEFINITELY NOT trivial.
Scenario 2b: What if HP and Compaq each have a local SqueakMap server, and then HP acquires Compaq and they want to merge the server contents together? There is some inevitable pain from this, but how can we minimize it? Multiple maps help on 2b, though there are more issues to consider as well.
Multiple maps help? First of all - this scenario is very uninteresting. I can't see why we should plan for that. Secondly - since SMObjects most likely will be movable "up" the tree (IMHO a very likely scenario - a private package going public) it should be very easy to put in a third server as parent of both - and then move the objects there.
But again, not very interesting to discuss.
These said, here are some individual comments:
With multiple maps, you immediately gain the following abilities:
- Keeping track of which package releases are in which Squeak
versions, with a UI that requires no extra activities by the users.
I can't understand what you mean. With multiple maps you will have to choose the "3.6" map in order to see the "3.6" packages, right? And with multiple maps I will need to select which maps to register a release in, depending on which Squeak versions it works in, right? Again, no difference compared to what we have now.
You have the general idea, though are missing one important difference. With multiple maps, each image can *implicitly* specify a universe it is operating in via the list of maps it is pulling from. Then all the tools need no further interaction with the user. Both when downloading packages and when uploading packages and when uploading thumbs-up notifications, the tools will Just Know what to do.
Again, this is just a tools issue. You can already tell the package loader to only show you packages/releases for say 3.6. There is no "further interaction".
I simply can't understand why it is beneficial to split the map (it is still a catalog - not a repository which you over and over seem to forget) over multiple servers and also doing so using a single partitioning - namely which Squeak version a package is for.
What about splitting it based on how stable the releases are? What about splitting it based on ANY other attribute? And then - realizing this - why splitting it at all? What is the gain?
The packages (all SMObjects, not just packages) are already categorizable using the category tree. We can slice them up in a whole range of different ways TODAY.
The mere physical location of the bits on the Internet is hardly interesting, is it? (again, remember that SM is a catalog, not a package repository).
Anyway, even if you disagree that the multiple maps approach is a better result, the really big point is that multiple maps gives it to you with no further code.
I do disagree, and I don't see what it gives me at all. :) I can already pick stuff based on Squeak version and a number of other criteria.
- The ability for users to set up their own package universes without
needing to coordinate with any central authority.
This is planned. The idea is to have a tree of map servers in which you add local additional information, thus you can have local packages only visible to you, or your company etc.
But this has NOTHING to do with which Squeak version things are meant for. It is purely a visibility thing - and the servers are still clearly connected with each other in a tree.
I don't understand your "nothing" comment, because visibility and versions-meant-for (I'd prefer universes-meant-for) are the same thing.
Eh, no it is definitely not the same thing. Maybe I wasn't clear. Visibility is the fact that you even know that there is a package named Foo. It is visible to you. It is thus not a purely private package only visible inside Bluefish. What Squeak versions it works in is a totally different thing.
In short - I am talking about public or private packages (and other SMObjects).
But I'll proceed on the parts I understand.
Trees of servers are very close to what I am talking about, but they do not give you the full functionality that multiple maps do. In particular, they do not support "mixin" servers such as the ones in Debian for DVD software and updated Java plugins, etc. There are hundreds of these mixins nowadays, so it is a valuable feature to support. To support mixin servers, though, you must allow the nodes of the trees to have multiple parent nodes. And once you do that, you are doing exactly what I suggest, only at the server level instead of in the clients.
I need much more information on what such a mixin server is supposed to give us before commenting further. I implore you to remember that SM is a CATALOG. Let me say that once more - A CATALOG. Like dmoz for example. Are there compelling reasons for us to split up a catalog over multiple machines? No, the only reason for doing that is to handle private information IMHO.
But to repeat - the server tree is purely meant for visibility. That is after all IMHO the only interesting "attribute" of the SMObjects to consider. If an SMObject isn't meant to be seen or even known about in public, there simply is no point in putting it on the master server. But if it is meant to be seen and known then it should be there.
Note also that the clients will use the same code as the servers - just as today. They are "local servers" so to speak.
- This is hard stuff to get working properly, unless we are going to
throw away stuff we have today.
I agree that difficulty of implementation is important. However, it needs to be compared with the difficulty of implementing alternatives. Also, isn't it bothersome if a union operation is difficult to implement?
Eh, care to elaborate?
I am just saying that the current SM model, which is a rather complex graph of objects including SMAccount and SMCategory (with more to come), is NOT "trivially" split up over multiple machines. That ought to be pretty obvious to anyone. And when I even can't see the point in doing the split (except for the planned tree structure to handle private stuff) then why would I even contemplate doing it?
- The ability for a package to be maintained by different people in
different package universes, e.g. if I only support Chuck in 3.7, Joe can volunteer to support it in 3.0.
This already works today with co-maintainers.
It mostly works, though it currently requires central coordination. What if it's Jose wanting to support the Spanish version of Chuck? Jose should not need permission from me, and he should be allowed to call the package "Chuck" on the Spanish SqueakMap. More on this below.
Well, he can do it in different ways today (well, #3 in the future):
1. Take your "Chuck", and create his own package named "Spanish Chuck" and go for it. That is a separate package with a unique name. No "coordination" needed.
2. Join as a co-maintainer and make spanish releases of Chuck. Would need coordination in that you need to allow him as a co-maintainer - but that seems pretty reasonable to me.
3. Could in the future make unofficial releases of Chuck without your consent. They would be somehow linked from Chuck so that people can find and install them, but they would be owned by Jose.
Now, I don't want to allow non-unique package names. It will just confuse things.
If someone wants to set up a spanish SqueakMap with spanish packages using spanish descriptions - then I personally would think that the best thing is not to do that and instead make SM handle it. Let us then add categories for that. I would still be very interested in seeing those packages, and many people are multilingual.
Just look at dmoz - it is a perfect example. It has lots of Swedish sites in it - and I can find them by looking in the correct category. What would lots of small local dmozes gain us, except confusion and me not knowing if I have looked in all the right places?
I find this list striking, when one considers that all of these features happen automatically and with no coding at all.
That is not true.
I meant, no coding beyond the coding necessary to get multiple maps. Do you disagree that multiple maps would give you the things I describe, with no further coding?
I meant that the coding necessary to handle a distributed SM model over distinct separate servers that don't know about each other and still maintain non-redundancy and handle the complex object graph that SM has today would include a LOT of quite complex coding.
And then, to gain what? AFAICT we gained nothing. I can't possibly see how the mere distribution of bits onto multiple servers give us benefits. It only gives me headache.
It itches in my fingers again.... ok, can't help it: SM is a CATALOG.
Now, perhaps I am misunderstanding the purpose of SqueakMap. If SqueakMap wants to be a catalog of everything, then I have misunderstood and that is fine. In that case, however, I do suggest that we start
Have you read the article I recently posted on SqP? It explains quite a bit about what I want SM to be. And yes, it was just recently posted so you may have not seen it yet.
No, I haven't. Thanks for the pointer; I look forward to reading it.
work on a new tool which is a "package universe browser". Users should have some tool that lets them select packages from a menu, install them, and have it just work.
Hmmm, and what is the package loader you think?
It could be either one. Do you see the difference I mean between a "catalogue of everything" and the toolset that would support package universes? SqueakMap clearly started life as the former, and it sounds like it is now trying to be both the former and the latter. It may simplify things to divide SqueakMap into two parts, one for each of these purposes.
I humbly disagree and think that SM, just like dmoz, should instead be extended to handle the various attributes we want to put on the SMObjects so that it can be sliced, diced and peeled just the way we want. I can't see any good reason (except for the visibility thing mentioned in the next paragraph) to split the SM catalog over multiple machines.
So the ONLY reason for adding the tree-structure I am talking about is to enable the ability to have "added local private SMObjects".
Regarding versioned dependencies, Julian, Goran did indeed call the configurations "dependencies". And please consider that in the A,B,C
I don't know when I did - it sounds "wrong" but it may have been a slip of the tongue at some point. Can you point me to it?
I am glad that you are not calling these dependencies any longer.
Calling what? I am totally lost. I have called them "configurations" for quite some time. A configuration is simply a tested "configuration" of a specific release in that it lists the dependencies in the form of the specific releases it works with.
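To make that definition concrete, here is a minimal sketch in Python (the class and field names are my own invention, not SqueakMap's actual SMObjects): a configuration attaches to one specific release and records the exact releases it was tested against, plus who recorded it.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Release:
    package: str
    version: str

@dataclass(frozen=True)
class Configuration:
    # Hypothetical model, not SqueakMap's real classes: one tested,
    # known-working "environment" recorded for a specific release.
    release: Release
    tested_with: tuple   # the exact releases it was verified against
    reporter: str        # maintainers *and* ordinary users may record these

# A user records that Mungo 1.3 was tested against Zingo 2.0:
config = Configuration(
    release=Release("Mungo", "1.3"),
    tested_with=(Release("Zingo", "2.0"),),
    reporter="jose")
```

Because a configuration lives outside the release itself, anyone can attach a new one after the release is out, which is the post-release revisability argued for elsewhere in this thread.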
However, when I made my post earlier, you did halfway object to my request to have unversioned dependencies:
http://lists.squeakfoundation.org/pipermail/squeak-dev/2004-July/080026.html
I am not sure what you are referring to. In that posting you asked for dependencies to be expressed between packages instead of package releases, and I said that I will take it into consideration. What I meant is that there may be some kind of relation that you want to express in that way - that was the thing I was blabbering about in "intended dependencies".
For example, "SharedStreams" has no intended dependencies at all - which means it should work fine in a vanilla image. If I mistakenly had made it dependent on some other package, then it would be nice to know this fact that no - there should not be any such dependencies.
But I am still robustly unshaken in my belief in the model of configurations that I have explained in painful detail in this posting. :)
Also, here is a direct quote from a while back where you call it a "dependency conflict" that the configurations have not been solved happily:
"Hey, Mungo 1.3 has a dependency conflict with Pingu 1.2. They both rely on Zingo but Pingu has only been verified to work with Zingo 2.0. According to the compatibility level of Zingo 2.0 it is categorized as *compatible* (=level 4, changes have been made but the maintainer says he hasn't changed the API so it should work), would you like to use that instead?"
If they aren't dependencies then why do they give a "dependency conflict"?
Hmmm? Ok, we are clearly not communicating. I said above that I don't think I have used the word "dependency" when I was in fact describing the concept I call "configuration".
An attached set of configurations to a specific release describes the dependencies it has - or rather - the set describes the tested known working "environments" for that release and thus I would say it does express the "dependencies" of that release.
The configuration model I have in mind is more about having users and maintainers "recording" working configurations so that other people, using an engine, can install packages and needed dependencies in an as easy way as possible.
Here you are calling them dependencies again....
No, here I was referring to the *actual package releases* needed to install in order to satisfy the needs of a set of selected packages/releases. Sure, perhaps sloppy wording. Perhaps I should have written:
"...so that other people, using an engine, can install package releases and the different package releases they need in order to work in an as easy way as possible."
Whatever.
It must be very easy to both:
- Record a working config
- Deviate from the information so far collected.
Sure. Please remove the "must", however. And by the way, it is
Why remove "must"? I really feel it MUST be easy. If it is hard, then it may very well fail.
interesting that you consider the second so important, when Debian is doing quite well without it.
Well, I don't consider Debian to be the super-duper solution - sure, it's good - but it is not perfect. Most people are aware that you can quite easily end up in problems with Debian too, especially if you try to mix things from testing into stable etc.
example there was *never* a time that I could upgrade *any* package. If
This sounds very wrong. Julian responded on this topic and it is probably purely useless for me to reiterate what he wrote.
What would be useful would be for someone to make a concrete proposal of what should happen in examples like the one I described. No one has done that. They simply say the configurations should be ignored. Doesn't it defeat the purpose, though, if all this great information is simply ignored by the tools?
I hope I have explained it in a good way now.
I have tried to explain these ideas many times. I think it will work.
And I hope you succeed. This is a wonderful rallying center of our community.
Yes, it is. And I also want to reiterate that there are two major reasons IMHO that SM has worked so far:
1. It is a catalog, not a repository. It is format agnostic, and even if it may "slip" here and there it is truly meant to be the Good Cooperative Guy in the community.
2. #1 has led to the community accepting SM as the singular catalog for Squeak. This is important. A catalog like this is only Truly Really Good if it is a singularity.
Debian is nice, but it ain't perfect.
It is also proven to work, as I'm sure you agree.
As is SM, as I am sure *you* agree. :)
decentralized playground, though, and everyone is free to play in it however they like.
Of course. But SM is such a central piece to the community, that I want as many as possible to like the way I am pushing it. I can't of course convince everyone - but just so you know - this is the main reason I am discussing this so much on this list.
One last thing. After reading this post and skimming your Squeak People article, I see that you are trying to centralize various aspects of SqueakMap. This seems like a good idea, but let us please be wary of building centralist policies into the architecture itself. Squeak is a loose community to begin with, and it seems overly ambitious to try and force people to organize more rigorously when they haven't decided to already.
I am listening, it is a delicate balance of course.
Let me give some specific examples. Here are some things that should be doable at a local level, without involvement from any central authority:
- The creation of new universes that draw packages from an existing
one. The local guys should not have to coordinate with the server you are drawing from.
If I understand you correctly, that is planned with the tree-of-servers. You will for example be able to have 54 packages only visible inside Bluefish but they are in every other way "connected" to the global map. This is the part that is tricky of course.
- The creation and maintenance of user accounts. Individual
organizations should be able to have their own accounts without publishing them publically.
That is planned with the tree-of-servers.
- The designation of who has permission to post package versions to
each server. Particular package universes should have their own rules about this, that even the original owner of a package cannot necessarily override. (Even though in the main Squeak repository, we want to give more deference to the designated package owner.)
You may not be surprised at this point, but multiple maps give you these decentralized policies automatically, and also, Debian manages its policies in a way that gives the above properties. While these properties are not strictly necessary, they are both desirable and attainable.
You know, if we consider the attribute "server of package X" - meaning which server it is placed on (you keep talking in terms of repositories but whatever) then we can EASILY in SM replace this attribute with a categorization.
Let's say I log into SM today and add the top category "Package universe" and then add... well, I am not sure - what partitioning we want that we don't have already... well, just to get my point through, I call them:
"Server 1" "Server 2" "Server 3"
Ok, now we can "put" the packages in one (or more) of these "universes" simply by adding any of these categories.
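As a sketch of what this amounts to (the data is invented, not the real SM model): if a "universe" is just another category, then selecting a universe is ordinary filtering.

```python
# Invented data: packages tagged with categories, one of which we treat
# as the "package universe".
packages = [
    {"name": "Chuck",   "categories": {"Server 1", "3.6"}},
    {"name": "Scamper", "categories": {"Server 2", "3.6"}},
    {"name": "Pingu",   "categories": {"Server 1"}},
]

def in_universe(pkgs, universe):
    # Selecting a universe is plain filtering on the category.
    return [p["name"] for p in pkgs if universe in p["categories"]]

in_universe(packages, "Server 1")   # -> ['Chuck', 'Pingu']
```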
So, what did we gain? We already have a range of categorizations. Again, I can NOT understand why the mere fact that something is strewn around on separate machines gives us an advantage. It is just painful in terms of implementation.
And I still note that you haven't explained how these multiple servers should synchronize the shared parts of the object model. I have asked repeatedly how SMAccount, SMCategory etc will work in a coordinated synchronized fashion. SM is a more and more complex object model - distributing it with read/write capabilities is NOT trivial, and I have repeatedly said so.
-lex
regards, Göran
PS. This posting took a considerable time to write, unless I find something really interesting in the coming replies I will not take the time to reply in such detail. I will reply though.
Hi Goran and Lex, Hi other Squeakers,
here are some comments about how my DEPS model could work in light of some of the thoughts from the BIG posting.
Note: in the following, DEPS means the corresponding paper contained in the mail with subject [DEPS][PAPER] Dependencies for Squeak, sent to the list recently.
goran.krampe@bluefish.se wrote:
Hi Lex! ...
lex@cc.gatech.edu wrote:
Hey Goran, ...
...
Btw, you did stumble upon a slightly interesting fact though - you said "how do you fit post-release bug reports into the world view?". Given that in the planned SM the dependency information is not embedded in the actual release (as in Debian), this means that the dependency information can be revised post-release. And it should be, especially by adding more known working configurations. If the maintainer or anyone else discovers that, darn - there was actually a serious bug in Z1.2 and Z1.3 is really needed - then the configuration can and SHOULD be changed.
This could be modeled as updated Transformations in DEPS.
And if someone discovers that, hey "I used Z2.0 and it seems to work fine with that too!" then that should/could be added too. That is one of the key points of NOT embedding dependency information inside the releases.
This could also be modeled as changed Transformations: see section "Technical terms -> Caps -> Boolean logic".
Aside from the issues with the "guarantees" idea, it has not been established that the jigsaw puzzle of version-specific dependencies is going to be practical for tools to work with. This is a significant technical problem that versioned dependencies add, and everyone seems to be waving it off as something to handle in the future. I don't know that this is solvable, guys, and no one has made a convincing argument otherwise.
I see it more as a *social* problem, to have responsible maintainers trying to do their best. But if a maintainer wants his/her package to be in widespread use - which would give some happiness, at least for me - then I think he/she is doing his/her best:
- to make it compatible with other packages other people are using,
- to be careful with a potential 'stable' categorization,
- to define its dependencies carefully.
And afterwards it is no problem to add conflicts...
Another idea: there also could be some rating system for packages, marking white and black sheep!
...
dependency jigsaw cannot be adequately solved. Let me run you through a small example. Packages A and B depend on Collections:
A 1.0 needs Collections 1.0
B 1.0 needs Collections 1.0
Fine so far, I install all three packages. Now the collections library gets updated to 1.1. I cannot install the upgrade without breaking the dependencies of A and B!! So, I wait before installing it, even though
I have treated this example (further parts are snipped here) in DEPS.
Not entirely true. First of all - the engine should be parameterizable (cool word) by you.
So the idea is that if you - at this point - select Collections 1.1 and press "install" it could say something like "The installed package releases A1.0 and B1.0 are only known to work with Collections1.0, but since Collections1.1 is marked by the maintainer as being 'Code changes, but only bug fixes' they could still likely work, do you still want to proceed and install it?"
The proposed versioning number system in DEPS is just a model to express such marks as 'Code changes, but only bug fixes' technically. Though I have had more *small* fixes in mind as an example for changing the 4th version number...
And if you set a parameter, let me fantasize here - "Allowed compatibility threshold" or something - then perhaps it wouldn't even ask if the new release was below a certain compatibility level.
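Such an "Allowed compatibility threshold" parameter might behave like the following sketch; the level names and numbers are invented for illustration, with level 4 standing for the "compatible, API unchanged" mark quoted earlier in this thread.

```python
# Invented compatibility levels, higher = safer; level 4 corresponds to
# the quoted "compatible: changes made, but the API is unchanged".
LEVELS = {"incompatible": 1, "api-changed": 2, "code-changed": 3, "compatible": 4}

def decide(release_mark, threshold):
    # Engine parameterized by the user's "allowed compatibility threshold".
    level = LEVELS[release_mark]
    if level >= threshold:
        return "install"    # proceed silently, no question asked
    if level == threshold - 1:
        return "ask"        # borderline: put the question to the user
    return "refuse"

decide("compatible", 4)     # -> 'install'
decide("code-changed", 4)   # -> 'ask'
```

The point of the sketch is only that the engine's nagging becomes a user-tunable policy, not a fixed behavior.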
Again I think it is important to mention the social aspects here: the better the package maintainers categorize their packages, the better such a technical solution can work, if that is the only mechanism. But there may be more to help the technical solution compute good results, e.g. prefiltering of packages by some criteria:
- stable/unstable,
- rating by users,
- rating by automatic testing,
- trust measure of the package maintainer (somewhat delicate),
- etc.
Now, it doesn't end there. Not only can you install Collections1.1 (and as I described in a much more informed way than if the recorded dependencies were just "A and B needs Collections") but if someone ELSE has already tried this, and discovered A and B work just FINE (who knows, perhaps there is even a test suite to run!) then he/she can have beaten the maintainers of A and B respectively and attached new configurations to A and B that say that, according to him - A1.0 works just fine with Collections1.1 (and the same for B).
In DEPS to be modeled by Boolean logic in requires.
So if you then trust this person, you can go ahead with even more information to guide you - he said it worked, and since it was Dan Ingalls - you decide to trust him.
This is an interesting point: rating 'configurations' by the people making them. Note on 'configurations': in DEPS these would be expressed as Transformations expressing the dependencies, possibly enriched by a pointer to the respective person together with his/her *installed* packages to get the whole thing. In other words: who has used which Transformations to get his/her fancy system configuration.
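That rating idea could be sketched as weighting recorded configurations by how much you trust their reporter (the names and weights below are invented):

```python
# Hypothetical trust weights for configuration reporters.
trust = {"dan.ingalls": 1.0, "unknown.user": 0.2}

def best_configuration(configs):
    # Prefer the recorded configuration from the most trusted reporter;
    # unknown reporters get zero weight.
    return max(configs, key=lambda c: trust.get(c["reporter"], 0.0))

configs = [
    {"reporter": "unknown.user", "works_with": ("Collections", "1.1")},
    {"reporter": "dan.ingalls",  "works_with": ("Collections", "1.1")},
]
best_configuration(configs)["reporter"]   # -> 'dan.ingalls'
```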
And my third final point - if there was no Ingalls-config available, and you did have to "take a chance" - then YOU can be the good citizen paving the way for others to follow by verifying it works and attach a new configuration to A and B.
But its expressiveness is limited, if the knowledge about *all* the installed packages hasn't been recorded, too. YOU can have a configuration with very few packages totally incompatible with many others in widespread use, for example.
Now - IMHO this is a GREAT model :). I am not saying we will not discover details to tweak with it - for example, I can surely see it extended with information about "intended dependencies" - meaning that the maintainer might want to tell people what packages he wants to depend on, this may not always be the same thing.
In DEPS the "intended dependencies" would partly be modeled by the compatibility level expressed by the version number given with the package. The other part would be Transformations for installing it, first given by the maintainer in addition to the package (there may be some process, too...). Later it should be possible to change the Transformations: normally they would be enriched with conflicts. I currently don't think it makes much sense to change a given version number without changing the package, too, but there may be exceptions (I think this will rarely be used in Debian for overriding the normal dependency mechanism).
I might be the main developer of package C, which also uses Collections. Okay, so eventually the author of A, being a great citizen, upgrades their Collections package and reruns their tests -- shock, nothing broke. While they are at it, they add a few class comments, and then post A 1.1 which depends on Collections 1.1. Drats, I still cannot update my Collections library, because that will make me uninstall B.
No big difference, at this point - select Collections 1.1 and press "install" it could say something (quite similar) like "The installed package release B1.0 is only known to work with Collections1.0. A1.0 on the other hand can be upgraded to A1.1 which is known to work with Collections1.1. Since Collections1.1 is marked by the maintainer as being 'Code changes, but only bug fixes' B1.0 could still likely work, do you still want to proceed and install Collections1.1 and also A1.1?"
Of course, exactly how the engine will talk to the user is a UI issue. :) And upgrading A1.1 would of course be optional - the engine should only be HELPFUL - not forceful. You should always be TOTALLY FREE to install what the heck you like.
This is an interesting point: how to *override* the automatically chosen solution? In DEPS a conflict is a conflict: it makes no sense to override a known conflict. But what about a weak conflict? E.g. Foo_3.2.1.1.1 *may* break Bar under some not very likely circumstances.
Straightforward idea (after some moments of thinking): classifications of conflicts and using them together with the user install policy as a filter together with hints to the user.
More simple solution: not stating the conflict in the DEPS-Transformation, but writing a warning (possibly standardized, so that GUIs could issue it automatically) into the description of the package. Then the user has the choice...
...
As I have clearly shown this is not the case. I have NEVER argued for the dependency engine to be some kind of POLICE. You can still install *whatever you like*. The whole point is that you can do this and be AWARE of what it means and what the risks are.
Additional idea: policy ignoring conflicts in DEPS-Transformations as option to force an install.
And I have also, I hope, clearly shown that since we can all attach tested working configurations you will not be stuck at the whim of the maintainers.
And I also hope i have shown that the configurations can be added AFTER the release,
I think of introducing some whatever dependency system slowly: at first only a few packages get their technically usable version numbers. Then the first package is published with separately delivered DEPS-Transformations describing its dependencies to these few packages, so they can be automatically installed with it. This is so comfortable that more and more package maintainers jump onto the same train...
and even MODIFIED after the release.
This is a crucial point: a package should contain its expected compatibility level (via its version number), but not its dependencies.
This is a GOOD THING - because if they are wrong they should simply be fixed.
Agreed.
...
Of course, it all gets "worse" in some kind of way with lots of intertwined packages - but I don't think the model will suffer. There are some really nice things here helping it:
- Users can help keep the dependencies tracked and verified, not only
maintainers. This is cool.
Here again I'm seeing some social questions: who is responsible for changing the 'official' dependencies which the non-expert user of Squeak (or an expert not knowing the 1000th package) *normally* uses? Since he/she wants to have the one-click install...
- Pressure can build on key packages that really needs to be upgraded
in order to work properly.
This leads me to something I haven't talked about before - it should be possible to record an "anti configuration" - meaning that "No, sorry, I tested A1.1 with Collections1.2 and nope, didn't work". This would be invaluable to have. Again, we record information - that is all. Then how the information is used is purely up to you. I can even imagine different engines or at least clearly different pluggable strategies. :)
Recording the "anti configuration" makes sense.
But I think while writing DEPS I had a more rigid policy in mind. Assuming the observation above is true: then it is a *conflict*, isn't it? Which *normal* user not trying to fix the bugs (say > 99% of all users) would ever like to install an "anti configuration"?
So this "anti configuration" should be expressed as a conflict to protect the normal user. For the few experts, which are not only experts, but also interested in this special broken "anti configuration" the override buttons should be there (but switched off in the official image).
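A sketch of that policy (the data layout is invented): the recorded "anti configuration" acts as a conflict for normal users, with an explicit override switch for experts.

```python
# Invented data layout: recorded failed combinations of releases.
known_bad = {
    ("A", "1.1"): {("Collections", "1.2")},   # the "anti configuration"
}

def may_install(release, partner, expert_override=False):
    # A recorded failure acts as a conflict protecting the normal user;
    # experts can still force the combination through.
    if partner in known_bad.get(release, set()):
        return expert_override
    return True

may_install(("A", "1.1"), ("Collections", "1.2"))   # False for normal users
```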
acceptable that every package needs to be tested with each incremental version of every other package, and thus achieve the lockstep progression? That seems very inefficient if most package updates are compatibility-preserving bug fixes.
I have described that the compatibility level categorization of releases will make this information explicit.
If you don't want to test your releases at all - then fine, others will do it for you. :)
The maintainer has made at least a minimal test, if there has been just one run of the package...
To an automatic dependency resolving system: I think such a bleeding edge package should just be outside automatic installation, since it would make more problems than it helps. A few of such packages and you can start with a fresh image... The good thing: no responsible maintainer of another package would put such a bleeding edge package in the requires of his/her package! So possibly this policy problem already has an automatic solution, given without any costs...
And if you do test them, then why not record which releases you tested it against?
This certainly makes sense.
...
And it will also be very suitable for test driven development. Did you hear that Stephane? :) Because if we can attach or in some other way associate how to run the tests with the packages (included or not) then they will form a formal basis of testing out new configurations of releases.
Hell, it could even be automated in theory - we could have a robot server sitting trying to test new combinations of package releases and attaching those autotested configurations automatically.
This would be very nice!
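In outline, such a robot is just an exhaustive loop over candidate combinations, recording the ones whose test suites pass. A toy sketch, with a stand-in for the real install-into-a-fresh-image-and-run-tests step:

```python
from itertools import product

def autotest(candidates, run_suite):
    # Try every combination of one release per package; keep the ones
    # whose test suite passes, as autotested configurations.
    packages = sorted(candidates)
    verified = []
    for combo in product(*(candidates[p] for p in packages)):
        chosen = dict(zip(packages, combo))
        if run_suite(chosen):      # stand-in for install + run tests
            verified.append(chosen)
    return verified

# Toy run: pretend the suite only passes with Collections 1.1.
candidates = {"A": ["1.1"], "Collections": ["1.0", "1.1"]}
ok = autotest(candidates, lambda c: c["Collections"] == "1.1")
```

In practice the combinations would need pruning (the product grows fast), but the recording side is this simple.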
To the questions of single or multiple tree or not tree'd maps
--------------------------------------------------------------
A dependency system should work with distributed maps as well. There should be distributed dependencies then, too; in the tree case: a local server surely has a map containing some local packages (why have it otherwise?), and therefore it has at least some dependencies between packages to be expressed, which its master doesn't know, if a dependency system should be used for these local packages.
... <snipped very much here>
Greetings Stephan
Okay, here is a response to a few important issues in this thread.
1. Just to be clear, what I have called "multiple maps" is pretty much equivalent to local servers plus the ability for a server to have multiple parents. I'll call it "multiple parents" from now on, if I am talking about potential changes to SqueakMap.
2. SMCategories could be used to designate universes, but they are not used that way right now. It should be a big deal to put a package into a universe, but putting a package into "3.6" is currently just a simple click of the mouse in a web browser. If we want to implement universes with categories (not my preferred implementation), then we need to define which category goes with which universe, and we need to modify the UI to make it clear that teleporting a package from one universe into another is a big deal that requires careful thought. For some universes, it should even require permission to add things in.
3. No one seems to get my distinction between a catalog of everything, and a one-click no-hassle package installer, updater, and remover. SqueakMap, today, is a catalog of everything. You can expect to go to SqueakMap and find any package that exists. However, you can't really expect to click on an item in SqueakMap and have it install successfully along with any dependencies it needs. That's fine for a catalog of everything.
In a one-click installer, though, you should not even *see* stuff that cannot be installed. You should not have to answer questions about different versions of stuff, unless you have explicitly asked to get into the nitty gritty. Every package version you see in the tool, should be consistent with whatever universe you are operating in.
All of these requirements seem to add up to a different system. To begin with, it seems both necessary and useful to admit that universes exist and to have the tools talking about them. One way to do that is to associate different categories, maps, or catalogs with different universes, so that you choose your universe by choosing which categories/maps/catalogs to get packages from, but this kind of association will not happen unless we plan for it.
Anyway, if you want SqueakMap to be a reliable auto-installer, then you are pushing yourself to have to deal with these issues at some point. On the other hand, if you are happy with it being a catalog of everything -- a humongously valuable service -- then maybe a lot of the issues in this post go away, because you can just tell people to use some other tool if that's what they want.
4. As a general rule of thumb, it is nice if the system lets any particular thing be decided in a decentralized way, even if we don't expect to need it immediately. No matter how benevolent a power is controlling the central server, there are likely to be some cases that someone legitimately wants different things locally. Let's late bind our policies. :)
That's the general decentralization mantra. Let's consider one particular case, to give you the idea: who decides what is allowed to go into the map? Several people have suggested that every package should have an entry in the "main" SqueakMap server. If you think about it, though, there are actually plenty of packages that would make sense to put into *a* SqueakMap, but not into *the* "main" SqueakMap server. For example:
a. Home movies by one of my relatives. The main SM will probably not want to hold a terabyte of stuff that only a handful of people care about.
b. Stuff that is illegal to post on a server in some countries. The central server will probably want to shy away from such things, but since such things include DVD-playing software, other people in the world might reasonably want to do it.
c. Stuff *offensive* to large portions of the regular Squeak community. I won't try to draw an exact line, but to get your imagination going, SM should surely be child-friendly and religion-neutral.
d. Stuff that is private to one organization, e.g. Disney's software for a new ride at DisneyWorld.
It would be extremely nice if people in the above categories could use the SqueakMap software without having to coordinate with any central Squeak authority, much less having to register their packages on the "main" SM server. They don't want their stuff public, and for the most part nobody else wants to see it, either.
This "what goes in?" question is just one example. For just about any property you can name for "the" SqueakMap, there is a legitimate reason for some people to want to do something different locally. We should strive not to *force* these decisions to be made centrally whenever we can avoid it. We should strive not to hamper our future organizational possibilities due to assumptions in the architecture.
5. Mixin repositories are very valuable, even though we don't need them immediately. Debian has hundreds of these, and it is only through the mixins mechanism that Debian users can use Squeak itself through the standard Debian tools. In the above listing, everything in a-c is likely to be preferred as a mixin; individual items matching those categories can be independent from each other, and different people will want different subsets of those servers to be visible in their universe.
Since mixins seem to be valuable, we should surely think carefully before we make any architectural assumption that would make them impossible. Is the single-parent assumption such an assumption? It appears to be.
6. Goran asks about my solution for merging. I look at class SMSqueakMap and I see no problem with simply concatenating the variables that come from the parents. I really don't see what the big deal is, because you would only try to merge SM's that are friendly with each other and mostly consistent. If the parent maps use different UUID's for the same category, then it's two different categories. If name conflicts are bothersome, then rename one of them. Etc.
At any rate, however hard it is to do a merge, it is only going to get harder as the model gets more complicated. If merging is valuable -- and it seems to be -- then the sooner it is addressed the less painful it will be.
The prototype Universes package has merging in it, for any who want to see its solution to merging. Look at class UCompoundUniverse.
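The concatenation idea might look roughly like this (a deliberate simplification; real SMSqueakMap state is far richer): same-UUID entries are treated as the same object, and name clashes between distinct UUIDs are resolved by renaming one of them.

```python
def merge_maps(parents):
    # Concatenate parent maps: same UUID means the same object; entries
    # with different UUIDs but clashing names are kept apart by
    # qualifying the name with the parent it came from.
    merged, name_owner = {}, {}
    for parent_name, entries in parents.items():
        for uuid, name in entries.items():
            if uuid in merged:
                continue                           # same object, already merged
            if name_owner.get(name, uuid) != uuid:
                name = f"{name} ({parent_name})"   # rename one of them
            merged[uuid] = name
            name_owner.setdefault(name, uuid)
    return merged

# Toy example: "main" and a local mixin share category u1; u3 clashes
# with u2 on the name only.
main_map  = {"u1": "Games", "u2": "Networking"}
local_map = {"u1": "Games", "u3": "Networking"}
result = merge_maps({"main": main_map, "local": local_map})
```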
7. You say that the versioned configurations will automatically contain unversioned dependencies, but I don't see how that can be so. Here is an unversioned dependency:
Scamper depends on URL
How can this be inferred from a big soup of packages like I have in my present image? It seems like the developer needs to give a hint at some point.
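For what it's worth, the claim that versioned configurations subsume the unversioned relation would amount to a projection like the following - assuming a configuration lists only a release's actual dependencies, which is precisely the point in dispute (the data below is invented):

```python
# Invented configurations: each release maps to the exact releases it
# was verified against.
configurations = {
    ("Scamper", "1.0"): [("URL", "1.0"), ("HTML", "2.1")],
    ("Scamper", "1.1"): [("URL", "1.1"), ("HTML", "2.1")],
}

def unversioned_deps(configs, package):
    # Project the versioned configurations down to bare package names.
    deps = set()
    for (pkg, _version), tested in configs.items():
        if pkg == package:
            deps.update(dep for dep, _ in tested)
    return sorted(deps)

unversioned_deps(configurations, "Scamper")   # -> ['HTML', 'URL']
```

If instead a configuration records the whole image's contents, the projection picks up coincidental packages too, which is the hint the developer would need to supply.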
Along these lines, can anyone give a *specific* response to the A,B,C,Collections example, that uses versioned configurations? Goran's response either had the tool asking the user on the very first upgrade, which does not seem the right behavior for a no-hassle one-click installer, or it had some as-yet-undefined compatibility level. How does a compatibility level work, and is it really going to give something more reliable than simply "grab the most recent version" ?
8. What "mess" in Debian is Goran talking about? If SqueakMap worked as well as Debian's apt I'd be ecstatic, so clearly I must be missing something. It is surreal to mention the mixing of testing and unstable as a problem, because that is precisely what I am suggesting that we *not* do in the auto-install tool. Testing and unstable are different universes in Debian, and they should not be mixed. In fact, the Debian tools do *not* mix them unless you go hack the files manually or unless you circumvent the main tools. SqueakMap, on the other hand, has everything mixed to begin with, which leads to people searching for ways to separate the universes back out.
-Lex
Hi Lex and all!
Trying to squeeze in an answer here...
lex@cc.gatech.edu wrote:
Okay, here is a response to a few important issues in this thread.
- Just to be clear, what I have called "multiple maps" is pretty much
equivalent to local servers plus the ability for a server to have multiple parents. I'll call it "multiple parents" from now on, if I am talking about potential changes to SqueakMap.
Yes, I am still "reflecting" on the various use cases and effects. :) We will see. It is currently on the "back burner" though, I want to incorporate a solution for dependencies first (including making it possible for Stephan to experiment with his model).
- SMCategories could be used to designate universes, but they are not
used that way right now. It should be a big deal to put a package into a universe, but putting a package into "3.6" is currently just a simple click of the mouse in a web browser. If we want to implement universes with categories (not my preferred implementation), then we need to define which category goes with which universe, and we need to modify the UI to make it clear that teleporting a package from one universe into another is a big deal that requires careful thought. For some universes, it should even require permission to add things in.
Sure. But all this is very doable. Adjustments to the UI, adding some mechanisms like perhaps some permissions etc.
- No one seems to get my distinction between a catalog of everything,
and a one-click no-hassle package installer, updater, and remover.
Well, I think I see the distinction.
SqueakMap, today, is a catalog of everything. You can expect to go to SqueakMap and find any package that exists. However, you can't really expect to click on an item in SqueakMap and have it install successfully along with any dependencies it needs.
Of course not - because it doesn't *have* dependencies yet. Pretty obvious to me.
That's fine for a catalog of everything.
In a one-click installer, though, you should not even *see* stuff that cannot be installed.
This is just filtering, trivial IMHO.
You should not have to answer questions about different versions of stuff, unless you have explicitly asked to get into the nitty gritty. Every package version you see in the tool, should be consistent with whatever universe you are operating in.
Fine, again filtering and preferences.
All of these requirements seem to add up to a different system. To
I don't agree. :) I just see it as new use cases and requirements that we need to implement.
begin with, it seems both necessary and useful to admit that universes exist and to have the tools talking about them. One way to do that is to associate different categories, maps, or catalogs with different universes, so that you choose your universe by choosing which categories/maps/catalogs to get packages from, but this kind of association will not happen unless we plan for it.
My biggest problem with "multiple maps" (in a loose meaning) is that we very likely will lose the very important aspect of today - there is ONE place to go to. I don't want to chase around looking for "maps" etc. Why should I need to?
I want to solve the use cases and STILL maintain our very nice complete singular place that we can go to.
Anyway, if you want SqueakMap to be a reliable auto-installer, then you are pushing yourself to have to deal with the issues in this point at some point.
I am. All the time. Why do you think I am working on dependencies?!
On the other hand, if you are happy with it being a catalog of everything -- a humongously valuable service -- then maybe a lot of the issues in this post go away, because you can just tell people to use some other tool if that's what they want.
I want to solve as many use cases as possible. I have always wanted to do that.
- As a general rule of thumb, it is nice if the system lets any
particular thing be decided in a decentralized way, even if we don't expect to need it immediately. No matter how benevolent the power controlling the central server is, there are likely to be some cases where someone legitimately wants different things locally. Let's late bind our policies. :)
That's the general decentralization mantra. Let's consider one particular case, to give you the idea: who decides what is allowed to go into the map? Several people have suggested that every package should have an entry in the "main" SqueakMap server. If you think about it, though, there are actually plenty of packages that would make sense to put into *a* SqueakMap, but not into *the* "main" SqueakMap server. For example:
a. Home movies by one of my relatives. The main SM will probably not want to hold a terabyte of stuff that only a handful of people care about.
b. Stuff that is illegal to post on a server in some countries. The central server will probably want to shy away from such things, but since such things include DVD-playing software, other people in the world might reasonably want to do it.
c. Stuff *offensive* to large portions of the regular Squeak community. I won't try to draw an exact line, but to get your imagination going, SM should surely be child-friendly and religion-neutral.
d. Stuff that is private to one organization, e.g. Disney's software for a new ride at DisneyWorld.
It would be extremely nice if people in the above categories can use the SqueakMap software without having to coordinate with any central Squeak authority, much less having to register their packages on the "main" SM server. They don't want their stuff public, and for the most part nobody else wants to see it, either.
You know - I have always planned for this! That is the whole point with many of the use cases for the new architecture which I was planning as a tree of servers - and now am undecided on.
You make it sound like I am opposed to this - and I am NOT. Really.
This "what goes in?" question is just one example. For just about any property you can name for "the" SqueakMap, there is a legitimate reason for some people to want to do something different locally. We should strive not to *force* these decisions to be made centrally whenever we can avoid it. We should strive not to hamper our future organizational possibilities due to assumptions in the architecture.
- Mixin repositories are very valuable, even though we don't need them
immediately. Debian has hundreds of these, and it is only through the mixins mechanism that Debian users can use Squeak itself through the standard Debian tools. In the above listing, everything in a-c is likely to be preferred as a mixin; individual items matching those categories can be independent from each other, and different people will want different subsets of those servers to be visible in their universe.
Since mixins seem to be valuable, we should surely think carefully before we make any architectural assumption that would make them impossible. Is the single-parent assumption such an assumption? It appears to be.
Well, I am rethinking the model after discussions with Julian. I still don't want to end up in a world where people have to hunt around for "maps".
- Goran asks about my solution for merging. I look at class
SMSqueakMap and I see no problem with simply concatenating the variables that come from the parents. I really don't see what the big deal is, because you would only try to merge SM's that are friendly with each other and mostly consistent. If the parent maps use different UUID's for the same category, then it's two different categories. If name conflicts are bothersome, then rename one of them. Etc.
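The concatenation approach can be sketched in a few lines (a Python stand-in, not the Smalltalk SMSqueakMap class; the dictionaries are invented, not its actual instance variables). Because categories are keyed by UUID, two parents that each define a "Games" category simply contribute two distinct categories after the merge:

```python
# Two parent maps that are "friendly" but use different UUIDs
# for a same-named category (hypothetical data).
parent_a = {
    "categories": {"uuid-1": "Games"},
    "packages": {"Chess": "uuid-1"},
}
parent_b = {
    "categories": {"uuid-2": "Games"},  # same name, different UUID
    "packages": {"Go": "uuid-2"},
}

def merge(*parents):
    """Merge by simply concatenating the parents' contents."""
    merged = {"categories": {}, "packages": {}}
    for p in parents:
        merged["categories"].update(p["categories"])
        merged["packages"].update(p["packages"])
    return merged

m = merge(parent_a, parent_b)
assert len(m["categories"]) == 2  # both "Games" categories survive
assert m["packages"]["Chess"] == "uuid-1"
```

If the name collision bothers anyone, renaming one of the two categories is a local, cosmetic fix; the identities stay distinct.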
Well, you are ignoring several cross links in the model, but whatever, I won't bother discussing that anymore...
At any rate, however hard it is to do a merge, it is only going to get harder as the model gets more complicated. If merging is valuable -- and it seems to be -- then the sooner it is addressed the less painful it will be.
Well, I have chosen to get the model a bit more evolved first - dependencies. Then I will go for the new arch.
The prototype Universes package has merging in it, for any who want to see its solution to merging. Look at class UCompoundUniverse.
- You say that the versioned configurations will automatically contain
unversioned dependency, but I don't see how that can be so. Here is an unversioned dependency:
Scamper depends on URL
How can this be inferred from a big soup of packages like I have in my present image? It seems like the developer needs to give a hint at some point.
Don't understand. But whatever. If Scamper-r1 depends on URL-r3 then I see the dependency, don't I?
Along these lines, can anyone give a *specific* response to the A,B,C,Collections example, that uses versioned configurations? Goran's response either had the tool asking the user on the very first upgrade, which does not seem the right behavior for a no-hassle one-click installer, or it had some as-yet-undefined compatibility level. How does a compatibility level work, and is it really going to give something more reliable than simply "grab the most recent version" ?
What?! I gave you a *specific* response exactly along my model. And the questions can easily be changed to auto actions based on preferences. The compatibility level IS ALREADY ON SM. Just create a new release and you will see. And I have explained TONS of times what it is.
Sigh. I gave my "specific response".
- What "mess" in Debian is Goran talking about? If SqueakMap worked as
well as Debian's apt I'd be ecstatic, so clearly I must be missing something. It is surreal to mention the mixing of testing and unstable
Well, you can't compare the two when SM HAS NO DEPENDENCIES YET.
as a problem, because that is precisely what I am suggesting that we *not* do in the auto-install tool.
And *I* am suggesting that it will not be problematic to do it in MY model.
Testing and unstable are different universes in Debian, and they should not be mixed. In fact, the Debian
What do you mean should not be mixed? I think that is up to the user.
tools do *not* mix them unless you go hack the files manually or unless you circumvent the main tools. SqueakMap, on the other hand, has
That is wrong AFAIK. I used Galeon from unstable when running testing. Don't recall all the details on pinning etc, but mixing sure is doable.
everything mixed to begin with, which leads to people searching for ways to separate the universes back out.
-Lex
regards, Göran
On Jul 31, 2004, at 10:00 AM, goran.krampe@bluefish.se wrote:
SqueakMap, today, is a catalog of everything. You can expect to go to SqueakMap and find any package that exists. However, you can't really expect to click on an item in SqueakMap and have it install successfully along with any dependencies it needs.
Of course not - because it doesn't *have* dependencies yet. Pretty obvious to me.
That's fine for a catalog of everything.
In a one-click installer, though, you should not even *see* stuff that cannot be installed.
This is just filtering, trivial IMHO.
You should not have to answer questions about different versions of stuff, unless you have explicitly asked to get into the nitty gritty. Every package version you see in the tool, should be consistent with whatever universe you are operating in.
Fine, again filtering and preferences.
I think you're missing a key point here, or at least dismissing it as less important than I think it is. When you say "just filtering", it sounds like you're talking purely about UI: which packages (and package releases) are displayed to the user in the package loader. But what Lex is saying is deeper than that: first, that these "filters" will affect which versions of required packages are used to fulfill dependencies by default (so it doesn't just filter what the user sees, but in some sense what the dependency engine sees as well), and second, that because of this behavior you can get away with a much simpler dependency system - because it's the filters that are doing most of the narrowing down to a particular release, and so the dependency engine doesn't have to.
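A minimal sketch of that reading (hypothetical data, not SqueakMap code): the universe filter narrows the release pool *before* resolution, so the dependency engine only ever picks among releases already known to belong to the universe and can stay trivial.

```python
# Full "soup" of releases, each tagged with its universe
# (invented data for illustration).
all_releases = [
    ("URL", "r2", "stable"),
    ("URL", "r3", "unstable"),
    ("Scamper", "1.1", "stable"),
]
deps = {("Scamper", "1.1"): ["URL"]}

def visible(universe):
    """The filter: what the dependency engine is allowed to see."""
    return [(p, r) for (p, r, u) in all_releases if u == universe]

def resolve(package, version, universe):
    """The engine stays trivial: for each dependency, take a release
    visible in the universe -- the filter did the narrowing."""
    pool = visible(universe)
    plan = {package: version}
    for dep in deps.get((package, version), []):
        plan[dep] = next(r for (p, r) in pool if p == dep)
    return plan

# In the stable universe, URL r3 is simply never a candidate:
assert resolve("Scamper", "1.1", "stable") == {"Scamper": "1.1", "URL": "r2"}
```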
That's not a trivial thing, and it's not just a UI layer on top of the SM model: it's a very different way to think about the dependency model, and I think you should take it into account. FWIW, it also aligns pretty well with the direction I've been meaning to take Monticello-level dependencies/tags...
Avi
Hi!
Avi Bryant avi@beta4.com wrote: [SNIP]
You should not have to answer questions about different versions of stuff, unless you have explicitly asked to get into the nitty gritty. Every package version you see in the tool, should be consistent with whatever universe you are operating in.
Fine, again filtering and preferences.
I think you're missing a key point here, or at least dismissing it as less important than I think it is. When you say "just filtering", it
No, I don't think I am. :)
sounds like you're talking purely about UI: which packages (and package releases) are displayed to the user in the package loader.
No, I was talking about filtering the model, in a general sense.
But what Lex is saying is deeper than that: first, that these "filters" will affect which versions of required packages are used to fulfill dependencies by default (so it doesn't just filter what the user sees, but in some sense what the dependency engine sees as well), and second,
Of course.
that because of this behavior you can get away with a much simpler dependency system - because it's the filters that are doing most of the narrowing down to a particular release, and so the dependency engine doesn't have to.
And you will be limited to that "universe", which I find to be a BIG negative.
I simply think having a "full" model which you (or the dependency engine) can just view a subset of is much more powerful than having multiple separate models and having all kinds of trouble trying to combine them while still maintaining "data integrity" etc.
Note though that in this case I am talking about "universes" meant to represent different subsets - like "stable" vs "unstable" etc. Lex still has a good point in the ability of having multiple maps but for *visibility*, like a private map for a company etc. And I still fear the consequences of enabling that without thinking hard about it. :)
That's not a trivial thing, and it's not just a UI layer on top of the SM mode: it's a very different way to think about the dependency model, and I think you should take it into account.
I am taking it into account.
FWIW, it also aligns pretty well with the direction I've been meaning to take Monticello-level dependencies/tags...
Avi
regards, Göran
goran.krampe@bluefish.se wrote:
...
lex@cc.gatech.edu wrote:
...
Testing and unstable are different universes in Debian, and they should not be mixed. In fact, the Debian
What do you mean should not be mixed? I think that is up to the user.
tools do *not* mix them unless you go hack the files manually or unless you circumvent the main tools. SqueakMap, on the other hand, has
That is wrong AFAIK. I used Galeon from unstable when running testing. Don't recall all the details on pinning etc, but mixing sure is doable.
One remark regarding mixing: I think it is possible to mix.
But in this case, the 'measurement' of compatibility (needed for the DepS system in my mind) between different packages (requires, conflicts) and between different package releases of one package ('compatibility code' in recent terminology) is more work for the maintainers. E.g., they would have to give some hints about how compatible a package from one universe is in the other one, in *addition* to the normal hints about how compatible it is in just one.
So it is good, that there are ideas for other mechanisms, too, *not* needing the compatibility 'measurement' at all (configuration and anti-configuration).
Greetings Stephan
...
Stephan Rudlof sr@evolgo.de wrote:
One remark regarding mixing: I think it is possible to mix.
But in this case, the 'measurement' of compatibility (needed for the DepS system in my mind) between different packages (requires, conflicts) and between different package releases of one package ('compatibility code' in recent terminology) is more work for the maintainers. E.g., they would have to give some hints about how compatible a package from one universe is in the other one, in *addition* to the normal hints about how compatible it is in just one.
Note that the "hint" could be:
A addPackage: P
But then again, at that point, it is no longer "mixing". :)
-Lex
Lex,
lex@cc.gatech.edu wrote:
Stephan Rudlof sr@evolgo.de wrote:
One remark regarding mixing: I think it is possible to mix.
But in this case, the 'measurement' of compatibility (needed for the DepS system in my mind) between different packages (requires, conflicts) and between different package releases of one package ('compatibility code' in recent terminology) is more work for the maintainers. E.g., they would have to give some hints about how compatible a package from one universe is in the other one, in *addition* to the normal hints about how compatible it is in just one.
Note that the "hint" could be:
A addPackage: P
this hint doesn't help. Just adding the package to another universe doesn't say anything about its compatibility in it. You need another 'measurement' related to the new universe. At least that is how I see a reasonable compatibility measurement; see http://minnow.cc.gatech.edu/squeak/3792 (what do you think about it? Note: not finished yet.).
Greetings Stephan
But then again, at that point, it is no longer "mixing". :)
-Lex
Stephan Rudlof sr@evolgo.de wrote:
lex@cc.gatech.edu wrote:
Stephan Rudlof sr@evolgo.de wrote:
One remark regarding mixing: I think it is possible to mix.
[...]
E.g., they would have to give some hints about how compatible a package from one universe is in the other one, in *addition* to the normal hints about how compatible it is in just one.
Note that the "hint" could be:
A addPackage: P
this hint doesn't help. Just adding the package to another universe doesn't say anything about its compatibility in it. You need another 'measurement' related to the new universe.
I was talking about how you *record* the hint, not about how you measure it. Instead of having an annotation on P that says "I seem to work in A", you can simply add P to A. This is a very important point. It is pretty much the definition of what a universe is.
It also means you don't need to mix.
Regarding compatibility codes: they sound very nice in themselves. The main thing that worries me is that "backwards compatibility" is a tough question if you do not have any context about what other stuff will be loaded in the image. Let me give three examples.
One example is that you may want an update to be "compatible" for the purposes of an unstable universe, but not compatible for people working in a stable universe. You may know there are some bugs in obscure cases, but you may still want the auto-installer to offer your package to people working in the development universe. Whether you say the update is compatible or not, it will be wrong for the many people operating in one universe or the other.
Another example would be where I make a new version of a package and have it use the new LargeList selectors. Is this backwards compatible? In a 3.7 image, this is perfectly backwards compatible, because it has LargeLists in it. In a 3.6 image, the code will stop working entirely, so it is surely not a compatible upgrade. How do I mark this? However you mark it, it is going to be incorrect for many people.
A third example would be if you remove a method that was deprecated in 3.4. That's completely backwards compatible in a 3.7-based universe, because no one is calling the method any longer, but of course it breaks compatibility in a 3.4-based universe.
All of the above questions are easy if you know which universe you are talking about, and unanswerable otherwise. Thus, it seems that compatibility codes should be associated with (universe,package) tuples, not with bare packages.
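As a sketch of that proposal (hypothetical data and marks, not SM's actual schema), the LargeList example above could be recorded like this, with the compatibility code keyed by the (universe, package release) pair rather than by the bare release:

```python
# The same package release gets a different mark per universe
# (invented data for illustration).
compat = {
    ("3.7", ("MyPkg", "2.0")): "compatible",    # 3.7 has LargeList
    ("3.6", ("MyPkg", "2.0")): "incompatible",  # 3.6 does not
}

def is_safe_upgrade(universe, release):
    """A release is only offered as a no-questions upgrade in the
    universes where it is marked compatible."""
    return compat.get((universe, release)) == "compatible"

assert is_safe_upgrade("3.7", ("MyPkg", "2.0"))
assert not is_safe_upgrade("3.6", ("MyPkg", "2.0"))
```

With a bare-release mark, one of the two universes above would necessarily get a wrong answer; the tuple key makes both answerable.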
-Lex
Lex,
lex@cc.gatech.edu wrote:
Stephan Rudlof sr@evolgo.de wrote:
lex@cc.gatech.edu wrote:
Stephan Rudlof sr@evolgo.de wrote:
One remark regaring mixing: I think it is possible to mix.
[...]
E.g. they had to give some hints about how compatible a package from one universe is in the other one, in *addition* to the normal hints about how compatible it is in just one.
Note that the "hint" could be:
A addPackage: P
this hint doesn't help. Just adding the package to another universe doesn't say anything about its compatibility in it. You need another 'measurement' related to the new universe.
I was talking about how you *record* the hint,
But I was talking about "they had to give some hints about *** how compatible *** a package from one universe is in the other one" (see above!)...
not about how you measure it. Instead of having an annotation on P that says "I seem to work in A", you can simply add P to A. This is a very important point. It is pretty much the definition of what a universe is.
OK. But you also have to add a compatibility measurement for each package version for each new universe (as you have stated below, too).
It also means you don't need to mix.
Regarding compatibility codes: they sound very nice in themselves.
:)
The main thing that worries me is that "backwards compatibility" is a tough question if you do not have any context about what other stuff will be loaded in the image. Let me give three examples.
One example is that you may want an update to be "compatible" for the purposes of an unstable universe, but not compatible for people working in a stable universe. You may know there are some bugs in obscure cases, but you may still want the auto-installer to offer your package to people working in the development universe. Whether you say the update is compatible or not, it will be wrong for the many people operating in one universe or the other.
Another example would be where I make a new version of a package and have it use the new LargeList selectors. Is this backwards compatible? In a 3.7 image, this is perfectly backwards compatible, because it has LargeLists in it. In a 3.6 image, the code will stop working entirely, so it is surely not a compatible upgrade. How do I mark this? However you mark it, it is going to be incorrect for many people.
A third example would be if you remove a method that was deprecated in 3.4. That's completely backwards compatible in a 3.7-based universe, because no one is calling the method any longer, but of course it breaks compatibility in a 3.4-based universe.
Your examples are good for illustrating my original point.
All of the above questions are easy if you know which universe you are talking about, and unanswerable otherwise. Thus, it seems that compatibility codes should be associated with (universe,package) tuples, not with bare packages.
Exactly (if packages are package versions/releases (I've been sloppy here above, too))!
But it would also be possible to put these as (universeAttribute (e.g. Squeak3.7), package version (!), compatibility) tuples in one universe, wouldn't it?
Greetings Stephan
-Lex
There have been many suggestions about how a package can survive a change in its class library. I believe the only workable solution is to let a package include all required objects and ignore any new versions that may come along.
Example 1: In a very early version of Tektronix Smalltalk, we compensated for an off-by-one error in a library class by adding 1 in the subclass. We then got a new and corrected version of the library. Our application was again off-by-one...
Example 2: A program working with bitmapped graphics worked perfectly until we got a new version of the VW class library. Our response time then went up from milliseconds to hours. There were no interface changes, but a library class that formerly cached a transformed bitmap now computed it afresh at every request. Caching it outside the inner loop in the app method got us back to millisecond response time.
Example 3: "On June 4, 1996 an unmanned Ariane 5 rocket launched by the European Space Agency exploded just forty seconds after its lift-off[...]It turned out that the cause of the failure was a software error in the inertial reference system. Specifically a 64 bit floating point number relating to the horizontal velocity of the rocket with respect to the platform was converted to a 16 bit signed integer. The number was larger than 32,767, the largest integer storeable in a 16 bit signed integer, and thus the conversion failed." Here it was the same program using the same library; it was merely a faster rocket.
The surface area between a system of classes and their superclasses is enormous, there is no way the library developer can foresee the effect of a change on the many and unknown users of the library. My own experience is also that it is practically impossible for an application developer to see the effect of hundreds of changes in the library even if they are carefully listed (as PPS used to do).
The solution I dream about for BabyUML is that it shall support many versions of a class simultaneously. Library classes shall not be known by name, but by object ID. The loading of a package should automatically cause the loading of the objects it requires if they are not in the image already. Object IDs must be unique in time and space. I estimate 90 bits, but local aliasing should be possible for efficiency. It seems sensible to use a finer granularity than class objects, "Traits" looks very promising in this context.
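A loose Python sketch of the ID-based loading idea (BabyUML is only a dream at this point, so everything below is invented): classes are published under globally unique IDs rather than names, a package records the IDs it requires, and loading pulls in exactly those versions, ignoring any newer ones.

```python
import uuid

registry = {}  # object ID -> class version

def publish(cls):
    """Assign a globally unique ID. uuid4 gives 128 random bits;
    Trygve estimates 90 bits would suffice."""
    oid = uuid.uuid4()
    registry[oid] = cls
    return oid

# Two versions of "the same" class coexist in the registry:
class CollectionV1: pass
class CollectionV2: pass

id_v1 = publish(CollectionV1)
id_v2 = publish(CollectionV2)

# A package records the ID, not the name, of what it requires:
package_requires = {"SomeApp": [id_v1]}

def load(package):
    """Loading brings in exactly the recorded versions, ignoring any
    newer ones that may have come along."""
    return [registry[oid] for oid in package_requires[package]]

assert load("SomeApp") == [CollectionV1]
```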
Finally, testing can only show the presence of bugs and not the absence of bugs. Bug free software takes bug free design and bug free implementation. The purpose of testing should only be to confirm that everything is OK. But our present programming paradigms do not yield readable programs. The goal of the BabyUML project is to make it practicable to create simple solutions to complex problems.
Cheers --Trygve
Hi Trygve and all!
Trygve Reenskaug trygver@ifi.uio.no wrote:
There have been many suggestions about how a package can survive a change in its class library. I believe the only workable solution is to let a package include all required objects and ignore any new versions that may come along.
It is worth noting here that the compatibility level idea is not meant to be the fundamental mechanism in the dependency model - it is instead meant as guidance, especially when the combination of packages creates conflicts in version (Package A1.0 needs B1.0 but C1.1 needs the newer B1.1) and you might need to try using a newer release that hasn't been tested yet.
So... the idea is to use the tested specific releases of required packages - just like you say - if there is no conflict. But when conflicts arise the idea is to be able to get guidance in trying new combinations.
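Sketched with invented data (not the actual SM model), the policy reads roughly like this: use the pinned, tested releases as long as they agree; only on a conflict consult the compatibility level as guidance toward trying the newer, untested combination.

```python
# Tested, specific required releases (the A/B/C example above):
pinned = {
    ("A", "1.0"): {"B": "1.0"},
    ("C", "1.1"): {"B": "1.1"},
}
# Maintainer's claim about B 1.1 relative to B 1.0:
compat_level = {("B", "1.1"): "backward-compatible"}

def plan(installs):
    wanted = {}
    for pkg in installs:
        for dep, rel in pinned[pkg].items():
            if dep in wanted and wanted[dep] != rel:
                # Conflict: consult the compatibility level as guidance.
                # (String max is a stand-in for "the newer release".)
                newer = max(wanted[dep], rel)
                if compat_level.get((dep, newer)) == "backward-compatible":
                    wanted[dep] = newer  # a guided, untested combination
                else:
                    raise RuntimeError("unresolvable conflict on " + dep)
            else:
                wanted[dep] = rel  # no conflict: use the tested release
    return wanted

# A and C together force the guided step up to B 1.1:
assert plan([("A", "1.0"), ("C", "1.1")]) == {"B": "1.1"}
```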
The idea of being able to have multiple versions B1.0 and B1.1 installed *at the same time*, as you suggest, is an idea that has no support in Squeak today and would also, IMHO, lead to confusion and possibly strange social effects.
With "social effects" I mean that it may very well lead to less reuse and less pressure on us developers to play nice with each other and build layered systems and/or respect each others development cycles. I mean, getting a statically built executable installed in my Linux box is simple - but the fact that it has "its own" of everything built in is not always a good thing.
[SNIP of interesting examples]
The surface area between a system of classes and their superclasses is enormous, there is no way the library developer can foresee the effect of a change on the many and unknown users of the library. My own experience is
True, the developer can not foresee *the exact* effect on *all* users of the library. But he should be able to assess the impact of the change. IMHO a working solution in this area needs to be based on pragmatism and social behaviour much more than theoretical mathematical exactness.
In this line of thought it is simply better if there is information about the expected impact - even if it will be wrong in x% of the cases.
also that it is practically impossible for an application developer to see the effect of hundreds of changes in the library even if they are carefully listed (as PPS used to do).
Definitely. But I still think it should be possible to classify the level of compatibility, at least without being totally off. :)
The solution I dream about for BabyUML is that it shall support many versions of a class simultaneously. Library classes shall not be known by name, but by object ID.
Similar to what Craig wants with his Squat work I presume. I still think it is a simplification that doesn't hold.
The loading of a package should automatically cause the loading of the objects it requires if they are not in the image already. Object IDs must be unique in time and space. I estimate 90 bits, but local aliasing should be possible for efficiency. It seems sensible to use a finer granularity than class objects, "Traits" looks very promising in this context.
Finally, testing can only show the presence of bugs and not the absence of bugs. Bug free software takes bug free design and bug free implementation. The purpose of testing should only be to confirm that everything is OK. But
I agree, testing shows that "the things being tested work". It doesn't say anything about the things not being tested. :)
our present programming paradigms do not yield readable programs. The goal of the BabyUML project is to make it practicable to create simple solutions to complex problems.
Cheers --Trygve
regards, Göran
Hello All,
goran.krampe@bluefish.se wrote:
Hi Trygve and all!
Trygve Reenskaug trygver@ifi.uio.no wrote:
There have been many suggestions about how a package can survive a change in its class library. I believe the only workable solution is to let a package include all required objects and ignore any new versions that may come along.
It is worth noting here that the compatibility level idea is not meant to be the fundamental mechanism in the dependency model - it is instead meant as guidance, especially when the combination of packages creates conflicts in version (Package A1.0 needs B1.0 but C1.1 needs the newer B1.1) and you might need to try using a newer release that hasn't been tested yet.
So... the idea is to use the tested specific releases of required packages - just like you say - if there is no conflict. But when conflicts arise the idea is to be able to get guidance in trying new combinations.
My view is slightly different: I would like to give the compatibility level mechanism more importance.
For me the compatibility level mechanism could serve as one base for automatic upgrading. Upgrading a lib for a new package, while staying compatible with other packages written for an older version of this lib, is just *one* case of automatic upgrading. Note: this does not prevent a UI from asking the user for confirmation of the suggested moves.
The compatibility information could be one filter for package versions to be chosen as upgrade candidates. In an ideal world this would be sufficient! In a non-ideal world, 'working configurations' and 'not working configurations' (the latter may be used to automatically express them as conflicts) are other kinds of information well suited to improving the chances of a successful upgrade.
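Those two filters can be sketched together (all data invented): upgrade candidates must first carry the maintainer's compatibility mark, and then any candidate that would reproduce a user-reported not-working configuration is struck out.

```python
# Filter 1: maintainer-provided compatibility marks.
compatible = {("B", "1.1"), ("B", "1.2")}
# Filter 2: configurations users have reported as broken.
not_working = {frozenset([("A", "1.0"), ("B", "1.2")])}

def upgrade_choices(installed, candidates):
    """Keep candidates that pass both filters, in order."""
    ok = []
    for cand in candidates:
        if cand not in compatible:
            continue  # maintainer never marked it compatible
        config = frozenset(installed + [cand])
        if any(bad <= config for bad in not_working):
            continue  # would recreate a reported-broken configuration
        ok.append(cand)
    return ok

# With A 1.0 installed, B 1.2 is struck out by the user reports:
assert upgrade_choices([("A", "1.0")],
                       [("B", "1.1"), ("B", "1.2")]) == [("B", "1.1")]
```

The priority between the two filters (and whether either is merely advisory) would be exactly the kind of policy knob described above.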
Göran seems to see the importance of these filters the other way around, but this is somewhat natural, and it should be possible to control and prioritize these filters by policies. The effectiveness of *both* depends highly on the participation of the people: the compatibility info more on the package maintainers, the working/not-working configuration info more on all users.
Greetings Stephan
...
On Jul 25, 2004, at 10:14 AM, lex@cc.gatech.edu wrote:
Let me give some specific examples. Here are some things that should be doable at a local level, without involvement from any central authority:
- The creation of new universes that draw packages from an existing
one. The local guys should not have to coordinate with the server you are drawing from.
- The creation and maintenance of user accounts. Individual
organizations should be able to have their own accounts without publishing them publically.
- The designation of who has permission to post package versions to
each server. Particular package universes should have their own rules about this, that even the original owner of a package cannot necessarily override. (Even though in the main Squeak repository, we want to give more deference to the designated package owner.)
I'll note that these are all things that are currently doable on a package-by-package level, using Monticello repositories, even if not at the SqueakMap level. I wonder if one solution to this argument about distribution vs. centralization might be to factor the model such that SqueakMap is only responsible for those aspects that we all agree should be managed centrally, and to use other mechanisms (Monticello, update streams, Squat) when we want to be more distributed.

I certainly agree with your three examples - that's why Monticello was designed the way it was. But I also agree with Göran about the benefits of having a single, central repository of metadata - it's just so valuable for someone to open a Package Loader in a virgin Squeak image, without having to do any configuration, and have one-click access to essentially any package anyone has released for Squeak. But we could still have SqueakMap as the central source of information about which packages exist, and have it point the user to the repositories that have information about specific versions (released in a non-centralized process by various people at various organizations).
To put this another way, what SM does, and does very well, is to act as an entry point: it lets the user get from "I wonder if there's a package that does X" to having a URL for that package. As Göran says, it's a catalog. But it's fairly relaxed about exactly what it is that it is cataloguing. There's no reason that URL has to point directly to a particular piece of code; it could be a URL to a directory full of package versions that were managed totally separately from SM. One could, for example, abuse the list of SM releases for a package to instead act as a list of sources (and leave the password open so that anyone could add one). Then you just need the right code in the image to use that data in appropriate ways.
Avi
Hi Lex and all!
First - Lex - if you think I am missing anything in my replies, please post again. I am getting slightly drowned right now. :)
lex@cc.gatech.edu wrote:
- My number one desire, which is not on your list, is to stop talking
about "the" map. At the very least, there should be a map per version of Squeak, because in practice packages in Squeak 3.6 are not going to work in Squeak 3.0 unless you make at least a few changes.
I think there are/will-continue-to-be plenty of packages that would work across multiple versions of Squeak. Any "independent" package that does not depend heavily on the volatile guts of Squeak... a chess engine, for example.
I didn't say you couldn't post a package to multiple maps. I expect that it will be less common than you seem to think -- for example, the changes to #new caused a *lot* of legacy code to break -- but whatever the ratio ends up being, there are definitely packages already that want different versions in different versions of Squeak.
Less common? Hardly. There are TONS of packages that work in 3.4, 3.5, 3.6, 3.7 etc.
And AFAIK only a few have been split up, with different packages for different versions - and do note that *that* isn't needed anymore. SqueakMap handles this today. Yes, it does. Perhaps not perfectly, but it does AFAIK.
I have deleted 50-odd lines of commentary about this; if you really disagree that we do frequently want different versions of a package in different versions of Squeak, I'll edit it and post it for your consideration. Assuming that we agree that we sometimes want to have different
No, I agree with that. What I *do not* agree with is that it would somehow be better/simpler by having multiple maps. I have yet to see a single convincing argument of that.
versions of the packages for different versions of Squeak, we must come up with some way to implement it. It seems to me that having separate
What do you mean? It is already implemented. Create a release and select what versions of Squeak it works with. Then create another release and select what versions it works with.
The thing that we don't have a UI for yet - but this might be a good time to add it - is branches. The model handles them.
maps results in a more pleasant system than what we have now, where people split their packages manually. And there are other benefits in
No need to split anymore. The splits happened, AFAIK, before we added support for releases.
addition to a convenient way to have multiple versions per package, as I have described in other posts.
Not sure what those were.
I don't have anything against having another map, but I do like being able to go to ONE map for everything I need, no matter what version of Squeak I need it for.
Your goal and your strategy are inconsistent. Today, we have a single map, and yet you cannot simply go to SqueakMap and see what packages work with the version of Squeak you are currently running. There are version tags there, but the tags are incorrect. Keeping the tags up to date is a real nuisance! Imagine a Squeak user of today loading Squeak 3.4 and then opening the Package Loader. How many of the packages they see are going to actually work when they install them?
Eh, I am not sure you are aware of what SM does today. Today it would work perfectly well. If we disregard the fact that the map contents aren't consistent with the current features of the map, it handles this as it should.
Use the filters and select "new safely available" (in a 3.4 image). Note that I haven't tried running current SM in 3.4 - I'm not even sure it works. :) That would show all packages with a mark of "Squeak 3.4" on either the package itself (implying all releases) or on at least one of its releases - at least one of which is published. Well, it is even a bit more refined than that - see SMPackage>>isSafelyAvailable.
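Going by the semantics described here (and not by SM's actual Smalltalk code in SMPackage>>isSafelyAvailable, which is more refined), the "safely available" filter might be sketched like this - a hypothetical Python rendering with illustrative data shapes:

```python
def is_safely_available(package, squeak_version):
    """A package counts as 'safely available' for a given Squeak version
    if the package itself carries the version tag (implying all releases)
    and has any published release, or if at least one *published* release
    carries the tag directly."""
    if squeak_version in package["tags"]:
        return any(r["published"] for r in package["releases"])
    return any(r["published"] and squeak_version in r["tags"]
               for r in package["releases"])
```

The important detail is that an unpublished release never makes a package "safe", no matter how it is tagged - which is exactly why the filter can be trusted in a virgin image.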
For some numbers here: there are 221 packages with release(s) (or the package itself) marked for 3.6, 178 for 3.7, and 89 for both. This tells me both that there are TONS of packages available for more than one Squeak version, and that many maintainers have managed to put that 3.7 tag in place.
It seems to me your whole argument has missed the fact that we have *releases* now.
On the other hand, multiple maps make your goal easy. The equivalent of the version tags would be kept up to date in the normal course of operation. Consider. Before you post a new version of a package, you will do at least some minimal amount of testing in each image, such as loading the program and seeing that it starts okay. Once you are done testing in each image, you'll open the Package Poster and press the "POST IT" button. Which map do you think it will post it to...? It will post it to the map that corresponds to the image you just tested in, surely.
What is this? Why are you saying that this is easier than what we have now (apart from the UI)?
First you say it is a nuisance to keep the tags up to date in the current SM, and then you say this? You have totally lost me. There is no difference at all. In fact, current SM is easier, because you can test the release in 2-3 different Squeak versions and then make the release once and for all.
Now picture what other users are going to see when they open up the Package Browser in any particular image.... they are going to see a map containing the most recent version of each package that has been tested in the image version they are using. If you did your testing in 3.6 and they are in 3.4, then they will simply and automatically see the last version you *did* test in 3.4. It all works so simply and smoothly that I daresay people won't think much about it at all, and they'd wonder what all the fuss was about if they were to read these threads!
I am sorry Lex, but this all works today. Open the package loader and filter on "display only safely available". Done. The only thing I didn't have time to add is showing which of the releases (of each package) are actually "filtered out" - I elected to show them all for now, because hiding releases may be confusing. But it would be nice to have a visual cue showing which specific releases are "safe" for you.
But the listed packages should be the correct ones, and if you use "install" on the package (rather than on a specific release) it should first pick the latest release that is *published* and *for your Squeak version*. Failing that, it tries to fail "nicely" and offer other choices. Sure, it can be improved.
I think it depends on the person. Some people almost always just want the latest known "working configuration," even if that means they do not have the latest and greatest of every required package. For example, if you have a dedicated computer performing a menial chore (capturing weather data, say) and you don't need or care about having the latest, you just want the computer to do the work reliably.
In this case, it's useful to be able to do a one-click install and KNOW that the entire configuration will work without having to debug or wonder if there's some change in one of the newer-versions of one of the sub-packages that is going to render it buggy.
In the extreme case, you won't update that computer at all, in which case all of this discussion is moot.
If you do occasionally update it, however, you will typically want to grab all of the bug fix updates that are out there, while avoiding any updates that add new features. Multiple maps solves this beautifully. Simply have a different updating policy for each map. A map for Squeak 3.6 would only allow updates that have been thoroughly tested and which do no more than fix bugs. End of story.
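The per-map policy idea can be made concrete with a small sketch (Python, with hypothetical names and a simple (major, minor, patch) versioning scheme of my own - neither SM nor Debian works literally this way): a "stable" map admits only patch-level updates, while an "unstable" map admits anything newer.

```python
def updates_under_policy(installed, available, policy):
    """Select updates according to a per-map policy: 'stable' accepts
    only bug-fix (patch-level) updates within the same major.minor;
    'unstable' accepts any newer version."""
    chosen = {}
    for name, cur in installed.items():
        candidates = [v for v in available.get(name, []) if v > cur]
        if policy == "stable":
            # Keep only versions that differ from the current one at
            # patch level, i.e. same (major, minor).
            candidates = [v for v in candidates if v[:2] == cur[:2]]
        if candidates:
            chosen[name] = max(candidates)
    return chosen
```

Under this reading, "a map for Squeak 3.6" is just an `available` set curated under the stable policy - which is also roughly how Debian's stable/unstable split behaves.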
So what if I want a stable KomHttpServer, but a bleeding edge Shout? Then I will need to connect to multiple maps at the same time? Do you have any idea how much more complex that would be compared to what we have today? How would multiple maps share categories in a sound way? Or information about maintainers? How would the dependency management work later on?
And don't say "Debian" - I know how it works there - but SM is more than that. It can already do more in many areas. (and less in others)
With single maps, things are more complicated....
No, they aren't. On the contrary: having a single domain model instead of multiple is by definition easier. There is nothing stopping the model from embracing all these mechanisms, and that is what I am working to do.
By the way, you used the word "know" very quickly. Not only is a complete absence of errors impossible to achieve, but people who really strive for it are surely not going to rely only on random testers from out on the Internet. If you really want a super-stable Squeak image, then you are going to have to test it yourself, in a specific configuration. Even if someone out on the net tested that exact same set of package versions, you still would not want to use that configuration automatically. You would want to test it yourself.
No one is saying "automatically". It is all about collecting information in a coherent model. It is easier to collect this information in ONE model.
Also by the way, the approach I described is exactly what Debian does, and it seems to work well. Debian is an excellent choice of Linux distribution for high-reliability situations, precisely because it does have separate streams of packages for "stable" and "unstable" systems.
There is NOTHING that prevents us from having the same in SM.
-Lex
Lex, I am getting slightly tired of this discussion because it doesn't seem to me you are even trying to look at how SM works.
I know docs are lacking, but the source is out there. Classes SMLoader, SMPackage, SMPackageRelease are kinda simple to look at I think.
regards, Göran
PS. This just proves I really, really need to write that SM article...
squeak-dev@lists.squeakfoundation.org