Saturday, February 12, 2011

Impact of maturing distinctions (or how the need for increasingly rich information generates software engineering challenges)...

One of the things I have noticed over my years as a software engineer has been the process of capturing some particular set of "requirements" and encoding them as rules in a software system I end up building. Then, over time, the need for the system to respond to subtle increases in information distinctions exposes software adaptation blocks; i.e. software design fragility. My response has been to try to design defensively around these areas. The end result: not accounting for distinctions early can generate changes that hit the core system design and ripple throughout the resulting complex system, while accounting for the more refined distinctions too early can prematurely make the system too complex, increasing risks to the system's continued useful existence.

Rather than talk in abstractions, here's a more concrete example of a particular data point in a system "growing" to need higher degrees of detail.

Let's use the notion of something nice and nebulous: hunger. The requirement is to include some value indicating whether an instance of a digital organism has its hunger signal turned on. In Java, this would be represented as a boolean field of a class:

private boolean hungry;

After we've been working on the overall system for a while (possibly having already released a version of it to the public/production/some-irreversible-external-thing), we discover that we need a bit more information. Rather than tracking hunger, we are really tracking the degree of motivation to seek food based on a set of discrete values: HIGH, MODERATE and LOW.

Obviously, these three discrete values won't fit in our boolean, which means we have to change the boolean to something else. Thanks to our looking ahead, we can see there may be additional discrete values added in the future (imagine something like WEE_BIT_PECKISH or EXPLOSIVELY_BLOATED), so we make the value an int, or better, an Enumeration, which is a more adaptable way to use the equivalent of an int with named constants. So, we have enhanced the variable's ability to represent information from 1 bit (boolean) to 2 bits (three values), with the ability to more easily expand in the future to as much as 32 bits. It would now look something like this:


public enum HungerLevel{
    HIGH
  , MODERATE
  , LOW
}
private HungerLevel hungerLevel;


Again, the system has been "published" when the next requirement comes in. There is a need to capture hunger more specifically, as a decimal number (pretend that hunger is now being discerned by testing for the presence/absence of specific neurochemicals). So, we need to move to a continuous value, or scalar.

private double hungerDegree;

And a short time later, there is a new discovery. It turns out that the human brain has two separate modules reflecting the individual's hunger state. One indicates that the person needs to eat to elevate blood sugar. The other indicates that the person is sated, as enough fat has been detected to have been ingested. So, now TWO scalars are needed. And yes, this is why you can feel very full after a huge turkey-day dinner and still crave more pumpkin pie.


private double satiety;    //affected by recent fat level intake
private double bloodSugar; //affected by recent insulin spike and drop-off


Additionally, it turns out these values shift independently over time. So, we need to add a third variable, time, and capture the two scalars for each time unit.

private double[][] digestiveHormonalState =
    new double[][] {{s1, b1}, {s2, b2}, {s3, b3}}; //sN = satiety, bN = blood sugar at time unit N (placeholders)

So, as the system requirements adapted, we have moved from a boolean to a discrete value to a scalar to a pair of scalars to a time-based array of pairs of scalars. This plays havoc with most software APIs. However, this model is strongly reflective of the real-world problems facing a software engineer as a system is designed, built and maintained over time.
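As a hedged sketch (the class and method names here are my own invention, not from the system described), one way to blunt that ripple is to hide the evolving distinctions behind a small immutable value type, so the next refinement lands in one class instead of reshaping a raw double[][] through every API:

```java
// Hypothetical sketch: the evolving hunger distinctions live behind one
// immutable value type; the rest of the system depends only on this class.
final class DigestiveSample {
    private final long timeUnit;      // when this pair of scalars was sampled
    private final double satiety;     // affected by recent fat intake
    private final double bloodSugar;  // affected by recent insulin spike and drop-off

    DigestiveSample(long timeUnit, double satiety, double bloodSugar) {
        this.timeUnit = timeUnit;
        this.satiety = satiety;
        this.bloodSugar = bloodSugar;
    }

    long getTimeUnit() { return timeUnit; }
    double getSatiety() { return satiety; }
    double getBloodSugar() { return bloodSugar; }
}
```

A time series is then just a List&lt;DigestiveSample&gt;, and the next distinction (say, a third hormone) is one new field here rather than a change leaking through every caller.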

There's no huge insight here. I just found this pattern occurring repeatedly in human meaning systems outside of software engineering, and found it interesting enough to share. And I am looking for ways to capture meaning adaptations (i.e. enhanced distinctions) while minimizing the overall impact they have on the systems built upon them.

Saturday, February 5, 2011

Attumi (email Batch D) - A Java-like Language with default immutability

This is the fifth in a series of 5 posts:

This is the final post showing the self-email-chain I wrote in response to my creative brainstorm started in 2010/November regarding my designing a "What's next?" language to follow Java.

Here is the fourth and final batch of notes, mostly unedited:

<BATCH_D>
  1. make sure that interfaces provide extendable/configurable factories for implementations based on general performance requirements (as opposed to encoding them in a particular implementation strategy available at the time the code was written) - this will enable better and deeper implementations to be written, thereby elevating the available performance without code recompiles - it will also allow better self-tracking resources which can replace their implementations with better strategies through accumulated metrics (perhaps with secondary thread "watchers" who keep the metrics costs away from the primary code pathway)
  2. facilitate collections code mapping to support a SQL query style syntax (should work very efficiently with immutable object trees) - thought based on this FLOSS extension for Java: http://josql.sourceforge.net/index.html
  3. vastly reduce the cost of composition to reduce the desire to use inheritance - consider having a class able to implement an interface and also specify the internal reference to a class to which all non-reimplemented methods will be forwarded (UnassignedReferenceException thrown if method forwarding attempted and instance has not assigned the internal reference)
  4. move equals() and hashCode() out of Object and into an interface - logically groups the functionality so that if one is reimplemented, so is the other - do not implement at Object...do like Eiffel and have as independent implementation which can be specialized/extended via inheritance or interface/composition
  5. consider adding an Iterable interface which is optimized for use in for() loops
  6. provide tuples (temporary grouping of values without methods without having to formally define a class, i.e. a temporary struct) for both arguments into a method (like Java's varargs) and as a return value - should minimize amount of boilerplate "temp" code required to implement simple composition and re-implementation classes and methods
  7. default to NotNullable for all declarations and require Null be explicitly requested/defined
  8. default to methods NOT being overridable (must explicitly identify method as being overridden)
  9. can allow descendants to extend (don't have to make final) immutable classes as the immutable guarantee holds
  10. when determining requirements for collection type, use these characteristics: A) identity duplicates allowed (==), B) value duplicates allowed (.equals), C) value similarity allowed (Comparator), D) iterator ordered (Comparable.compareTo()), E) immutability, F) degree in which to favor speed (of specific operations) versus space (weak and/or lazily instantiated context) - use a builder with these parameters to return appropriate implementation to specific interface - enable plug-in replacements at Builder itself (for replacement implementations)
  11. in specialized file for enhancing execution optimization logic, allow alternative lower level optimization (perhaps even assembly) where the context of the assumptions outlining the use of the alternative code would be asserted and it would only call the alternative if the assumptions were true
  12. consider all potential sources of side-effect randomness and attempt to eliminate to allow perfect deterministic models to be written without accidental side-effects (ex: Java HashSet/HashMap.keySet() and the multi-threaded GC interaction resulting from the default implementation of hashcode being an interpretation of the instance's memory location - use in thread Random)
  13. Research Google's Go as a possible influencer on direction of design : http://golang.org/doc/go_faq.html
  14. Good Java concurrency subtlety article (influence how operators are grouped - increase domain of atomicity - or explicitly disallow the syntax - i.e. assume immutable and multi-threaded throughout the entire language - create "safe zones for mutation" and for "thread communication"): http://www.ibm.com/developerworks/java/library/j-concurrencybugpatterns/index.html
  15. Nice coverage of using inheritance in OO versus delegates in functional languages (need to facilitate both) using Clojure: http://www.ibm.com/developerworks/java/library/j-clojure-protocols/index.html
</BATCH_D>
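To make note 3 above a bit more concrete, here's a minimal sketch in plain Java of composition-by-forwarding, with an IllegalStateException standing in for the proposed UnassignedReferenceException (the interface and class names are invented for illustration):

```java
import java.util.Objects;

// Hypothetical sketch of note 3: composition where a non-reimplemented method
// forwards to an internal delegate, throwing if the delegate was never assigned.
interface Greeter {
    String greet(String name);
}

class DefaultGreeter implements Greeter {
    public String greet(String name) { return "Hello, " + name; }
}

class ForwardingGreeter implements Greeter {
    private Greeter delegate; // the internal reference all forwarded methods use

    void assignDelegate(Greeter delegate) {
        this.delegate = Objects.requireNonNull(delegate);
    }

    public String greet(String name) {
        if (delegate == null) {
            // stand-in for the proposed UnassignedReferenceException
            throw new IllegalStateException("forwarding attempted before delegate assigned");
        }
        return delegate.greet(name);
    }
}
```

In the language imagined here, the forwarding boilerplate itself would be generated by the compiler; the sketch only shows the runtime semantics.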

As I said with batches A, B and C, depending upon interest, I will consider diving into some/all of these and exploring my thoughts and reasoning.

Attumi (email Batch C) - A Java-like Language with default immutability...

This is the fourth in a series of 5 posts:

This is the next post showing the self-email-chain I wrote in response to my creative brainstorm started in 2010/November regarding my designing a "What's next?" language to follow Java.

Here is the third batch (of four) notes, mostly unedited:

<BATCH_C>
  1. define notion of an init-once, read many method (for lazy initialization) to clear overhead of check for initialization every time method called
  2. default of transparency (everything is public) and hidden state must be specifically called out
  3. check out http://fantom.org/ for their approach to multi-threading (i.e. actor based with immutability)
  4. allow passing parameter by name (while ensuring no speed gain/loss for using name as opposed to position - use stubbed API call point to map name to position)
  5. allow defaulting of unpassed parameters - when combined with parameter by name, vastly reduces boilerplate code for implementing class constructors
  6. allow struct/tuple (multi-value) return instance so return from function can be multi-dimensional without formalization of class definition (temp class which will likely reify into real class later)
  7. consider removing null entirely (force the meaning of empty, or undefined)
  8. mature versions of modules, packages, access along with something like partial classes (develop requirements for information hiding versus deployment versus security) 
  9. promote aspect oriented programming (useful for logging/transactional separated from functional code) - consider it being useful for debugging (and during debugging, allow aspects to be explicitly skippable)
  10. consider defining a "test" characteristic that when a class is defined as a test, it can have access to more private parts of a package/module so as to allow for exhaustive testing without impacting the design by forcing the package/module to expose things to the world just so testing can be complete
  11. facilitate immutable pathing to enable soft/weak reference implementation - lazy instantiation pathway can be marked with degree of expense in regenerating so that how that reference is stored (more local to CPU all the way to some form of swapped network based storage) enabling a continuously changing set of options for memory locality - also enables regions to cycle through heating up (come closer to CPU) and then cooling down (push closer to network storage)
</BATCH_C>
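Note 1 above already has a close cousin in today's Java: the lazy-holder idiom gives init-once, read-many semantics, where the JVM's class-initialization guarantee (rather than a hand-written per-call check) guards the first run. A minimal sketch, with the names invented here:

```java
// Hypothetical sketch of note 1: an init-once, read-many value.
// The nested Holder class is not initialized until get() is first called,
// and the JVM guarantees computeOnce() runs exactly once, thread-safely.
class ExpensiveConfig {
    private ExpensiveConfig() { }

    private static final class Holder {
        static final String VALUE = computeOnce(); // runs once, on first access
    }

    private static String computeOnce() {
        return "computed"; // stands in for an expensive computation
    }

    static String get() { return Holder.VALUE; }
}
```

The note goes further than this idiom can: it asks the language to clear away even the residual class-initialization bookkeeping, which a JIT largely (but not explicitly) does today.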

As I said with batches A and B, depending upon interest, I will consider diving into some/all of these and exploring my thoughts and reasoning.

Friday, February 4, 2011

Attumi (email Batch B) - A Java-like Language with default immutability

This is the third in a series of 5 posts:

My last post, ...(email Batch A), explained the creative brainstorm started in 2010/November regarding my designing a "What's next?" language to follow Java. And I shared the first email covering my first batch of self-directed notes for exploring the things I would want to consider incorporating into a design.


Here is the second batch (of four) notes, mostly unedited:

<BATCH_B>
  1. consider use of transactional memory model to handle concurrency (as opposed to lock based model), push distributed (Erlang-like) model into separate/library/framework
  2. Source occurs as a named resource (name spaces are merely dot separated identifiers) and stored as ASTs (expressed both in platform independent binary as well as well formed Schema defined XML file) as opposed to the currently more common text file - enables ability to have text representation be human language independent (i.e can be represented in English, Chinese, etc.), allows personalized layout and indentation (avoiding entirely issues around syntax formatting flame-wars) and enables source to reside in a file, set of folders/files, different kinds of DB, over network streamed from undefined data source conforming to API (ex: Web-dav)
  3. when defining reference, possible states {once-at-definition, *once-post-definition*, many} X {*no-null*, nullable} where * is default
  4. when defining class, possible states {*immutable*, mutable} X {*not-derivable*, derivable} where * is default
  5. enforce immutable by requiring all references be to immutables, i.e. if class is immutable, the semantics are guaranteed to be immutable all the way down through any references chains (will enforce acyclic graph, i.e. child cannot contain reference to parent or parent's parent, etc.) - must use external supplemental data structure to facilitate traversing child to parent relationships
  6. execution start-up will pass "main()" a map of key/value pairs - which will be able to be represented as a simple property file/stream, XML file/stream, etc. will be completely generified to make code not care how the execution started and obtained the initial values
  7. design ability for an immutable data structure to lazily fill in state without requiring all state to be completed in definition by the time the constructor terminates (may be challenging to prove from a code path perspective, if not careful)
</BATCH_B>
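Note 6 above can be approximated in current Java by funneling start-up state through a Map rather than a raw String[]; the Launcher name and the "mode" key here are invented purely for illustration:

```java
import java.util.Map;

// Hypothetical sketch of note 6: start-up hands the entry point a generic map
// of key/value pairs, so the code never cares whether the values came from a
// property file, an XML stream, or the command line.
class Launcher {
    static String describe(Map<String, String> startupValues) {
        String mode = startupValues.getOrDefault("mode", "default");
        return "starting in " + mode + " mode";
    }
}
```

The imagined language would do this generification at the `main()` boundary itself, rather than leaving each program to parse `String[] args` into such a map.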

As I said with batch A, depending upon interest, I will consider diving into some/all of these and exploring my thoughts and reasoning.

Attumi (email Batch A) - A Java-like Language with default immutability

This is the second in a series of 5 posts:

Early last November (not December like I said in my "Why?" post), I had lots of things rolling around in my head about what to do "next" after Java. I wanted to complete my board game rules model in Java. I toyed with doing the UI. I was just not at all attracted to the tonnage of boilerplate required to do a Java UI with any tool. And I was already getting sick of Java boilerplate anyway even with all the help from the numerous tools Eclipse provides.

And it looked like Objective-C was just going to be tons more boilerplate, only I would also have to learn and become competent in Objective-C and the iOS libraries first. Not at all attractive, either. I had immersed myself in the C#/.NET books and was actually looking forward to some change and some new things (like LINQ, real properties, etc.) which made C# seem significantly less long in the tooth than Java (I know, I know, it's coming in Java 7, or 8...or whenever the hell Oracle says it will...eff Oracle).

I didn't plan the thought avalanche. I never do. They come to me completely unbidden. I already had several programming projects and businesses I wanted to start. And I had at least two businesses completely unrelated to technology (they use it, but are not driven by it) I was playing with building. My creative juices were flowing in those other areas. And then on the way to work during my commute, it hit me like a lightning bolt in the form of a question...

"What if I was to modify Java until its default state aligned with my primary use case pathways? What if I made Java immutable by default, and then made mutability a pain to get to and to use?"

What if being mutable had to be defined as "the unsafe side-effecting dangerous malicious memory-leaking thread-deadlocking code is here" via copious annotations assisting the compiler (which would complain vociferously if the code writer were to forget the slightest "thar be dragons har" indications)? And that was just the start. Once I had the seed idea to take Java and shift it about until it suited me, I could then make it mean the removal of lots of boilerplate, re-prioritize things that mattered, rip out legacy crap (like raw collections) and the like. It was like a fantastic thought experiment all in my own private programmer's fairy-land.

I then began to privately email myself. Anytime I was coding on any project, home or work, I would just jot down some thoughts around the "pain point" and the solution-like notions that would come to me. That way, I would continuously have the stream of thoughts and be able to adapt and play with them as needed. It was so much fun. I could now sit back and just enjoy writing Java code while part of my brain solved the current coding problems the "Java" way and another part was off playing with how it really should work. And the modifications ranged all the way from very low-level implementation efficiencies to very high-level pattern extractions/abstractions.

So, what did those thoughts look like? Well, I was going to do tons of clean up on my notes and make them all spiffy. However, I would rather do more fun things like read more about Scala, and code in Scala, and make suggestions/requests to the Scala team around things that burped up for me during the last three months. I made all these notes before venturing out to start researching existing JVM alternatives. I had not investigated a single JVM language until AFTER I wrote all these notes. So, in a way, it was quite pleasant to have a strong sense of what I wanted before I actually took the leap and started researching other languages which might fill the bill. Besides, it's not my style. I like problem solving, even problems that have already been solved. It's why I love to play Go. It's why I do Sudoku, it's why I dig little puzzle games like Osmos and Chime.

Here is the first batch (of four) notes, mostly unedited:
<BATCH_A>
Consider an OO language design where immutability was the default:
  1. immutability assumed unless otherwise explicitly called out and specified (use marker interface mutable in similar way abstract is used on abstract classes)
  2. reduce security issues via content changing unexpectedly in deeper structures
  3. create basic method parameter checks (i.e. simplified preconditions for null, ranged values, size, etc. - most all the pre-conditions typically used to validate basic parameters to a constructor)
  4. use builders to have mutable content until ready to build immutable instance (like StringBuilder for strings, include ArrayBuilder for arrays)
  5. all "primitives" are formal Objects and optimized internally for performance value (including arrays)
  6. add optional execution suggestion plan to help structure CPU and memory layout for optimal execution
    • can allow array of Integer to be grouped internally into a memory block of ints[] which are then accessed via pointer arithmetic
    • enables faster iterators where boundary checks can be disabled
    • separates the code abstraction definition (i.e. design) from the actual instruction and memory layout generation (i.e. implementation)
  7. optimized relationship with Collections library (allowing for internal implementation to eliminate unnecessary boundary checks)
  8. consider partial construction and lazy instantiation of additional immutable state as important (so as to only engage in CPU and memory for the execution pathways actually used)
  9. ensure only an immutable instance may be passed from one thread to another (reduces memory coordination/collaboration/optimization issues substantially)
  10. an immutable object is state whereas a mutable object has state
  11. when designing Collection builders, ensure specification of requirements (i.e. read heavy, write heavy, etc.) which then interprets and selects implementation (i.e. ArrayList for lots of random access reading, LinkedList for lots of insertion and sequential traversing)
  12. within implementation of actual compiled code, consider a reference to be more than just a memory pointer, but an actual object representing data
    • for an array of Integer, it could be type, count, size of single entry, pointer to start of a memory block
    • for a specific instance of Integer in the array, it could be type, container reference (above), index (and if immutable, the operation for this fetch only has to occur once to the locality needed to operate on it)
  13. Review Java and Eiffel design for other possible ways to add semantics
</BATCH_A>
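Item 4 above is the classic builder shape; as a minimal sketch with invented names, mutation is confined to the Builder, and build() freezes the result into an immutable instance (like StringBuilder producing a String):

```java
// Hypothetical sketch of note 4: mutable content lives only in the Builder;
// the built Creature is immutable (all fields final, no setters).
final class Creature {
    private final String name;
    private final double hungerDegree;

    private Creature(Builder b) {
        this.name = b.name;
        this.hungerDegree = b.hungerDegree;
    }

    String getName() { return name; }
    double getHungerDegree() { return hungerDegree; }

    static final class Builder {
        private String name = "unnamed";
        private double hungerDegree;

        Builder name(String name) { this.name = name; return this; }
        Builder hungerDegree(double d) { this.hungerDegree = d; return this; }
        Creature build() { return new Creature(this); }
    }
}
```

The idea in the note is that the language would generate this pairing automatically for every immutable class, rather than each author hand-writing it.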

And that was the first brainstorm. Depending upon interest, I will consider diving into some/all of these and exploring my thoughts and reasoning. If there's no interest (and mine wanes such that I don't bring it up on the Scala mailing lists), it will just dissipate into the Internet ethers. I'm just glad to have had those thoughts pass through my head, get dumped out, and can let them go. {smirk}

Wednesday, February 2, 2011

Why?

This is the first in a series of 5 posts:

The oppositional part of me says, "Because!" However, that would make for a pretty brief blog entry and not leave much else to post later, huh?!

There are numerous reasons. They cover a wide spectrum. Here are the top three influencers for me:
1. Java's getting long in the tooth and becoming too complex - it badly needs a reboot as the legacy it's having to maintain (ex: raw collections) is now deeply thwarting effective adaptation
2. Oracle now owns Java, and I don't trust Oracle. At all.
3. I want to create multi-core/multi-CPU apps with less effort (than Java) while future-proofing my new programming language investment

1. Java's Becoming Over-Complex:
I suppose the strongest reason for me is the complexity of Java itself. I just finished writing a personal project over the holidays, and I found myself having to write acres of defensive boilerplate code in Java to implement a relatively simple two-player turn-based game. And when I went to do performance testing, I discovered weird inconsistencies I had not really noticed before. It was a personal learning project, so I was bringing an additional level of scrutiny to what I was doing.

For example, to prepare for taking my code multi-threaded later, I decided per Joshua Bloch's advice in "Effective Java, 2nd Edition" to make as much of the app immutable as possible. It was quite a bit of work, but I essentially achieved it. And I was quite proud of that. However, I discovered there was no real safe way to tell if a List already had an immutable wrapper. So, it turned out I was copying, wrapping and then re-copying and re-wrapping the same List and passing it around. And after researching the issue (I even started a StackOverflow question), I found there was no "smell-free" way to effectively detect if a list was already immutable; at least not one that seemed like good OO design which was also high performance. This led me on a chase to see what other options I had besides the standard Java collections. I quickly found Google's Guava libraries project and it felt like a leap in the right direction. Guava seemed to address a number of obvious pain points I had experienced with Java.
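The pain point is easy to reproduce with the standard collections (the defensiveView helper name is my own, for illustration): since a caller can't reliably ask "is this List already unmodifiable?", defensive code tends to wrap unconditionally at every boundary, stacking wrapper upon wrapper.

```java
import java.util.Collections;
import java.util.List;

// Illustrative sketch: without a way to detect an existing immutable wrapper,
// every API boundary wraps again "just in case".
class WrapDemo {
    static List<String> defensiveView(List<String> in) {
        return Collections.unmodifiableList(in); // wraps even if 'in' is already wrapped
    }
}
```

For what it's worth, Guava's ImmutableList.copyOf is idempotent (copying an ImmutableList just returns it), which is part of why Guava felt like a leap in the right direction.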

Another inconsistency in Java was the weird differences between Java arrays and Java collections. Arrays just don't work right and are mutable by default (and there's no real way to make them immutable without TONS of boilerplate). And as I researched that, it appeared to stem from the challenge of ensuring strong backwards compatibility with original versions of Java from 15 years ago. Java's starting to have a pretty distinct odor around it. It sure appears it's time to stick a fork in it and call the 1.x branch done. And generate a new, backwards-incompatible (to some degree) version which shakes off some of the old crufty/smelly/dingy stuff from the past.
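Here's what that array tax looks like in practice (the class and field names are invented): an "immutable" holder of an array must clone on the way in and on the way out, because nothing in the language can freeze the array itself.

```java
// Illustrative sketch: defensive copies are the only way to keep an
// array-backed class effectively immutable in Java.
final class Scores {
    private final int[] values;

    Scores(int[] values) { this.values = values.clone(); } // copy in
    int[] getValues() { return values.clone(); }           // copy out
}
```

Every accessor pays a full copy; a collections-style immutable wrapper pays none, which is exactly the inconsistency being complained about.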

2. Oracle's FLOSS behavior regarding the Sun products/projects has been atrocious:
And then Oracle sued Google over Android (claiming copyright and patent infringements). And the fact Oracle did so using software patents pissed me off to no end. I was already pretty suspicious of Oracle's motives after it bought Sun, given how it initially treated the FLOSS communities. Then, Oracle's initiating aggressive litigation was just another "annoyance" around Java itself. I no longer trusted Java's steward, Oracle. Here's an article I wrote on my personal blog covering this in a bit more detail. And it isn't just me having these thoughts. Here's a snippet from an article by a world-wide consulting group sharing a very similar view of uncertainty (i.e. instability) around Oracle's now "owning" Java.

3. Writing mobile phone apps for iPhone/iPad and Android:
Finally, I am interested in creating an app for mobile phones, and specifically my mobile phone. I had an iPhone 3G (and have since gotten an iPhone 4 upgrade the last week of 2010). I spent most of October getting the cheapest Mac Mini I could find so as to be able to learn how to create apps for the iPhone.

I then spent some time diving into Objective-C. UGH! That was like losing the last 20 years of software language advances in one fell swoop. I am not saying Objective-C is bad. I am saying that I also don't think COBOL is bad. However, neither appears to have any sort of long-term positive career-development angles for me. And of course, Steve Jobs mandating that all iPhone and iPad apps must use Objective-C, C and/or C++ this time last year just pushed me over the edge to dig in and research alternatives. I think Jobs' dictate last year created more interest in Android by developers than anything either Google or the OHA (Open Handset Alliance) could have ever done. So tyvm, Steve.

After about 5 hours of investigation into Objective-C, I was looking around and found MonoTouch. This meant I could use C#/.NET to create iPhone and iPad apps. And I knew C#/.NET had lots more long-term career advantages for me than learning Objective-C. I spent about 15 hours working through my options in the area. I then decided that once I had completed my board game rules model in Java, I would have someone else put the UI on it (via Rentacoder - apparently recently renamed to vWorker) and I would work on converting the model to C#/.NET.

While reviewing C#/.NET, I couldn't help but notice a very similar amount of boilerplate was required to author classes as in Java. Granted, it was not quite as much. But it was still LOTS. And I have very limited spare time within which to make my learning investments. So, while C#/.NET was WAY better than Objective-C (for me and my values) in that at least I had a garbage collector (talk about removing tons of craptastically intricate Objective-C code), it still felt heavy, like Java...significantly over budget in the complexity department.

I know, I'll just invent my own programming language:
And then on the way to work in early December, I had an "aha". A question entered my mind on my morning commute, "What if I were to invent my own Java-like language, but make it default into patterns that were more aligned with the pathways I primarily used?" As soon as I asked that question, my mind began to flood with ideas. And as I worked on my personal project over the next month, I kept replying to an email I had sent myself, adding the things I wanted. I covered the gamut, from very low-level machine instruction efficiencies all the way up to strategic re-configuration/adaptation of collection class defaults for running systems. It felt great to have all these ideas flooding me.

After about 3 weeks of writing emails to myself, I began to share my ideas with a couple of very close friends, explaining my predicament and my "aha". And from those conversations, I became convinced the "pain points" in the current tools were exceptionally high around multi-core/multi-CPU regardless of Java, C#, C++/C, Objective-C, etc. And none of them really "defaulted" into a more immutable way of thinking.

And then I had a conversation with Charlie Strahan, my best friend's son who is becoming a software engineer. I shared with him some of the details of my "programming language invention". He then asked me if I had checked out Clojure and Scala. I said I thought those were JVM scripting languages. He then urged me to check out functional programming, mentioned F# on the CLR, and said he thought Scala was the closest to what I was describing, and that there were some free chapters on the Internet I could read to get a feel for the language.

First Clojure, and then Scala:
The next day, I noticed an article on Clojure and began to research it. I spent a couple of hours reading up on it. Basically, it was a JVM implementation of Lisp with some better specialization. The most important part, though, was that it was immutable by default. I was immediately intrigued. I spent a couple more hours reviewing it. That's when a text from Charlie came in directing me to the free chapters from the book "Programming in Scala". I read them. I then started reading any articles I could find on Scala. I was startled that it hit at least 60% of the ideas I had generated on my own. At first I was annoyed. That lasted about 10 seconds until I realized, OMG, this means I don't have to do the work. I can just use Scala.

And so began my trek into learning Scala. The points which convinced me the quickest were around the language's originator, Martin Odersky. As I learned more about him, I found he had started in Modula-2 around the same time I was using Modula-2; he was a student of Niklaus Wirth, for whom I had great admiration around Pascal, Modula-2 and Oberon; he was the originator of Pizza and GJ (generics for Java before Sun even had the thought); he then moved to work for Sun, which requested the use of his compiler and generics, replacing its own in-house compiler; and after finishing at Sun, he started the Scala project around 2003. I read a bit more about Odersky and found myself admiring him and his accomplishments. And then I found "Programming in Scala, 2nd Edition" had just been printed. I ordered it, the physical book and ebook version, from Artima Developer. I got the .pdf, put it on my iPhone and started reading it. And I have not put it down since. That was three weeks ago.

Back to my Whys and Hows:
So, that brings me full circle. I still want to port my game app. And I still want to do an iPhone/iPad app. And I found out that Scala is currently being worked on to target the .NET CLR (it already targets the Sun JVM). So, if it can target the CLR, and I can get MonoTouch to take the CLR output from Scala and compile it for the iPhone (I know this is a stretch...still lots of work to figure out if this is even possible...I don't know the limits yet), I can use Scala to author those iPhone/iPad apps. And that will make me UBER happy as I will have maximized my new language learning investments across the spectrum.

So, I intend to post here about my slow but sure process of learning Scala by programming with an imperative (as opposed to functional) style in Scala on basic starter applications. And then through comments, coaching, cajoling and debate, I hope to quickly hone my skills at seeing how to begin to understand and integrate functional programming into my existing OO knowledge. I am hopeful this creates a nice "default pathway" for those who decide to follow this same path, a gentle approach of first getting Scala's syntax by just converting straight Java to Java-in-Scala. And then little by little (certainly much slower than the way it is presented in "Programming in Scala, 2nd Edition"), I will make the adjustments and leaps to incorporate new concepts, revisiting the code I've already written imperatively.

Feedback loops are essential to effective adaptation:
So, don't be shy. Comment away. Tell me I'm being silly, distracted, ADHD, ABM (Anything But Microsoft), ABA (Anything But Apple), arrogant (invent a programming language, are you KIDDING ME?!), etc. The other purpose of this blog is to learn of others who share my frustrations. And/or to learn of frustrations others are having, so I might discover they are undistinguished frustrations I am also having but have not had the time or focus to put words on myself. And thank you for anything you contribute. I can only grow and learn from whatever it is you choose to share.

Next Post:
I will present the (cleaned-up) self-email-chain which led me to begin exploring Java/C# alternatives, identifying the parts that are already present in Scala, planned for Scala or, if still relevant, posed as requests for enhancement to Scala. I am not so presumptuous as to think my thoughts were so well thought out as to be able to really be integrated into Scala. However, what I recorded were my personal pain points in using Java (1.6.0) today (well, as of 2010/Dec).