[squeak-dev] valid characters for binary selector

Chris Muller asqueaker at gmail.com
Thu May 30 02:54:05 UTC 2019

> > So is it okay / advisable to go beyond the original spec if it lets me?
> I would prefer an interface that still looks like message sending in the first place. I suppose there are some cases where the parser (or compiler) lets you do something that might just be an oversight. I would consider "- at +" way too cryptic.

Yes, I wouldn't ever use it.  It was just a test to see whether the
compiler would accept it.

> That's the path you would be walking on if you follow that:
> self ><>.
> self »-(¯`·.·´¯)->.
> self --------{---(@.

OMG, LOL!!!!!!

> https://1lineart.kulaone.com/#/ :-)


> Not good. Just my two cents.

Yes, definitely not good.  I would never do it myself, but my query is
about taking foreign code as input: I must convert binary selectors to
alphanumeric names to conform to an underlying protocol (while
avoiding collisions with any keyword selectors), with a reverse
conversion on the other end.  So I needed to know the full range of
binary selectors to expect, and even wondered whether I should
disallow some based on the dimensions Jecel mentioned, like standards.
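To make the idea concrete, here is a minimal sketch in Python of one such
reversible mapping.  The character-name table and the "bs_" prefix are my
own assumptions for illustration, not part of any Squeak or underlying
protocol; the real table would need to cover every character the compiler
accepts in a binary selector.

```python
# Hypothetical sketch: a reversible mapping from binary selectors
# (e.g. '<=', '@', '~~') to alphanumeric names. The table below and
# the 'bs_' prefix are illustrative assumptions only.

CHAR_NAMES = {
    '+': 'plus', '-': 'minus', '*': 'star', '/': 'slash',
    '~': 'tilde', '<': 'lt', '>': 'gt', '=': 'eq',
    '&': 'amp', '|': 'bar', '@': 'at', '%': 'pct',
    ',': 'comma', '?': 'qm', '!': 'bang', '\\': 'bslash',
}
NAME_CHARS = {name: ch for ch, name in CHAR_NAMES.items()}

def encode(selector):
    # The 'bs_' prefix keeps the result distinct from keyword
    # selectors, which always end in a colon, and from ordinary
    # unary selectors that happen not to start with this prefix.
    return 'bs_' + '_'.join(CHAR_NAMES[ch] for ch in selector)

def decode(name):
    # Reverse conversion: split on '_' and map each part back.
    assert name.startswith('bs_'), 'not an encoded binary selector'
    return ''.join(NAME_CHARS[part] for part in name[3:].split('_'))
```

For example, encode('<=') gives 'bs_lt_eq', and decode('bs_lt_eq') gives
back '<='.  The round trip works because each selector character maps to
exactly one name and the separator '_' never appears inside a name.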
