Monday, November 16, 2009

The little things mean the most...

Today I ran into a programming pattern that I had never seen before.

The situation: there are multiple constructors of a Java class that all need to run the same 'base' block of code.

Well, I try to minimize opportunities for typing mistakes.  So, when I came across a situation like this, I created a method called base() that each of the constructors would call.  That way, the code would exist in one place and would be easier to keep clean and up to date.

Good plan right?

It worked, but today I was playing with the idea of making my big giant app send Twitter updates to let me know what was going on and notify me as errors occurred.  The Java library that I stumbled onto was twitter4j by Yusuke Yamamoto.

I wanted to make sure that if I tried to send a large (more than 140 character) tweet - it would be handled gracefully rather than getting truncated.  At first glance, it looks like it would just get truncated.  I'll have to check further to see if this is true.

But, my base() trick is not as clever as I thought.

There was a better way that I had never seen before today.  And that is to call the regular zero-parameter constructor from each of the other constructors as this().  Clean and beautiful - thanks Yusuke.
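
For anyone else who has not seen it, here is a minimal sketch of the idea - the class and fields (StatusUpdater, maxLength, failOnOverflow) are made up purely for illustration:

public class StatusUpdater {

    private int maxLength;
    private boolean failOnOverflow;

    // The zero-parameter constructor holds the shared 'base' setup.
    public StatusUpdater() {
        this.maxLength = 140;
        this.failOnOverflow = true;
    }

    // The other constructors chain to it with this() instead of calling a base() helper.
    public StatusUpdater(boolean failOnOverflow) {
        this();                         // shared setup runs first
        this.failOnOverflow = failOnOverflow;
    }
}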

To myself I say...

Duh.

Tuesday, November 10, 2009

Cookbooks and Recipes: Men, Women, and Programming...

Disclaimer: I think that everyone who chooses to be a programmer has to be (at least) a little bit crazy - whether male or female. But I think we are missing out by not having more female programmers - so if you are female and so inclined then I at least would welcome you. Also, I have a strange sense of humor.

So... Now that we've settled that...

I got to listen to Kirrily Robert speak at ApacheCon last week. She gave a keynote about women in open source and technology called 'Standing Out In the Crowd'. And it struck me that there are numerous books on programming that include the words 'cookbook' or 'recipe' in their titles.

Now, I love to cook - so I don't mean to imply that cooking is in any way a 'womanly' pursuit or task. But, there are many that think of it that way.

Anyway - one of the ideas that are presented as justification for the fact that there are few females in programming is that somehow the 'pink brain' is not suited to mathematics.

I will not even bother arguing whether or not this is true (Kirrily cited a study that determined that there was a slight difference). Regardless of any difference that may exist - it does not matter. I have been a programmer for over two decades and, with the exception of a very small number of projects, very little math has been involved.

What is involved (as far as I am concerned) is understanding a problem and then teaching a computer to go through the steps to solve it.

So, how good a programmer you are is determined by:
  1. How well you can wrap your head around a problem.
  2. How well you can learn a particular language's syntax.
  3. How well you can break the solution of the problem into a sequence of steps.
  4. How well you can codify those steps in the language's syntax.
Most of the time, not much math.

Some languages may lend themselves to solving particular types of problems better or more 'elegantly'. But I have never heard of a language (human or computer) that is somehow easier to learn based on which gender you are.

Sometimes, boolean algebra can be used to make solutions 'prettier'. But most of the programmers that I have worked with never studied boolean algebra - male or female.

Hopefully, math will stop being used as justification for saying or thinking that 'women can't program'. And instead, the idea that programming is really teaching will gain more mindshare.

Not every person who decides to be a programmer is cut out for it. There is a certain amount of insanity necessary to be good at it because you are effectively trying to teach a machine to solve a problem by describing the solution in a language that is not natural for you -or- the computer. And the computer is not going to go out of its way to bridge that gap. That is all up to you!

There is nothing that I can see that makes males more suited to programming than females. And empathy (seeing a situation from the point of view of the other 'person') is often thought of as being a trait that is stronger in females. So, if there is a natural advantage held by anyone - it might be by women.

So, if you have an interest in foreign languages and in teaching completely empty-headed students who will force you to explain every single step in excruciating detail - then programming might be for you.

There may be a very good reason why there aren't more female programmers.

They are smarter than us.

Tuesday, November 3, 2009

Goin back to Cali...

I am in California.

This is the second time that I have been in this state.

The first time was over fifteen years ago.  And I was here to see how a software company was planning on translating their product from a character-based program that ran from a command line in either DOS or *nix - to a Windows program.

They were going to try to use a program to translate a BASIC language system to Visual Basic.  But the translation program required a person to make sure that the individual programs were written according to a set of programming standards.

They almost never were - so they ended up doing a substantial rewrite and forced those that had written customizations to write them all over again.

This time, I get to attend the 10th anniversary ApacheCon.

Over the past ten years, the Apache Software Foundation has grown from a small group of people who forked the code from the NCSA web server - to an international group of programmers and users who develop and use a growing set of software projects (currently more than 60).

About three years ago, my day job gave me the opportunity to develop a substantial new system that would eventually replace a number of scattered information systems that spanned spreadsheets and flat file databases.  It would also integrate that information with the company's accounting system.

In trying to figure out the best platform for developing this new system I stumbled onto WebSphere community edition (WASCE) and through it - Apache Geronimo.

I was hooked.  Here was an environment to develop for that was scalable, standards based, open source, fast, ... everything I could hope for.

Through Geronimo - I found a number of other projects that are integrated by Geronimo.  And each of them was the same - open communities that seek feedback and encourage participation.

I do not know how many people are reading this.  Or how many of you are involved in software development.  But it is an amazing thing to find people spread out across the whole world who are passionate enough to give away their time and energy developing and/or supporting software.

Last time in California turned out to be a moderately interesting waste of time.

This time, nothing has really started yet - and I am already excited to get to be a part of something so bold and (at least in my experience) unique.

This is, in a word...

Awesome.

Wednesday, October 7, 2009

OSGi (or - late to the party again)...

Just when I thought that I might have been getting in on the beginning of 'the next big thing' - I find out that I am actually ten years (years?) late.

Recently the Apache foundation accepted a new podling into the Incubator called Aries.

If you are unfamiliar with the 'Incubator' and/or 'podlings' - take a look at this page on the Apache website:  http://incubator.apache.org/

Anyway, the Aries project seeks to develop the bridge between OSGi and Java EE.

I have come to really like the way that Java EE makes it possible to separate front-end development from back-end processing.  And since OSGi seems to be the way that much of the Java world is going - I'm trying to get in on the ground floor with Aries.

But, the 'ground floor' is pretty high.  OSGi has been around for ten years now.  And so there is quite a lot of 'assumed knowledge' that goes along with it.  Also, the actual specifications for Java EE in OSGi are still being written and are not expected to be finished until the beginning of 2010.  Aries hopes to help in fleshing out those specs as well as providing an implementation of them.

Well, I should get back to reading the specs that have been written.  They are only 516 pages long (ack!).

So far.

Saturday, August 8, 2009

Excuses...

It is very easy to have being 'too busy' become the reason for everything that you don't get around to doing.

But, I suspect (and very much hope) that getting more organized will help.

I recently tried to put together a todo list and managed to get half a page of software fixes that I needed to do (almost certainly an incomplete list) onto it, along with 'make complete todo list' and 'enter todo list into some kind of software to keep track of it'.

I didn't get much farther than that because I thought that tracking my list would only make sense if I had a mobile device of some kind that would allow me to access the list when I was away from my computer.  And right now my phone is decidedly 'not smart'.  Which is fine for my phone - all I do with it is make and receive calls.

But, it is not good for me - because that basically ended my quest for becoming organized.

So, this past week I have begun 'Organization Attempt 2.0'.

Rather than putting the list of software issues onto a piece of paper - I finally got around to entering all of them into the issue tracker that I installed over a year ago for just that purpose.  Until now, I had only been entering them as someone became available to work on them (which makes any kind of planning rather difficult).  I probably missed some (or lots) of issues.  But it is a start.  And as long as I can keep the list updated as I remember them (or they raise their ugly heads) - then I will be freeing up a big block of mental space for more useful purposes.

I am still wrestling with the idea that I need a mobile device to carry around a digital version of the rest of my todo list.  I still have several months left on my mobile phone contract and my provider does not sell the smartphone that I would want to get.  Getting out of my contract is not particularly cheap (over a hundred dollars) and I do not want to settle for one of the phones that they do have.

I want to get rid of that excuse though.

I'm tired of being overwhelmed all of the time.

Tuesday, July 21, 2009

See my pretty hammer...

It happened again.

I learned about a feature in OpenJPA that allows you to specify the fetch behavior exactly as you need it - fetch groups.

And, they are awesome.  My opinion doesn't carry much weight - but if it did, then fetch groups would be part of the JPA spec.

But like every tool - they should only be used when it is appropriate.

One reason that they should be used only when really needed is that in Apache Geronimo, casting a regular EntityManager to an OpenJPAEntityManager causes a new transaction to be started.
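
For reference, the cast itself looks something like this - a minimal sketch, where the injected EntityManager and the "detail" fetch group name are assumptions:

import javax.persistence.EntityManager;

import org.apache.openjpa.persistence.OpenJPAEntityManager;
import org.apache.openjpa.persistence.OpenJPAPersistence;

public class FetchGroupCast {

    // 'em' is a container-managed EntityManager; "detail" is a fetch group
    // assumed to be defined on the entity with OpenJPA's @FetchGroup annotation.
    public static void useDetailFetchGroup(EntityManager em) {
        OpenJPAEntityManager oem = OpenJPAPersistence.cast(em);
        oem.getFetchPlan().addFetchGroup("detail");
    }
}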

In itself that isn't necessarily so bad, but it does begin to clutter things if you use them excessively.  And, that is exactly what I did.

I took my shiny new hammer and started whacking everything.

Including my thumb.

Ouch.

Tuesday, July 14, 2009

I wonder if this box is too small...

I am starting to think that I will need to take another look at the database that I am using.

Up until now, I have been using MySQL because it is free and many (many, many,...) web sites use it for their database back end.

But I seem to be testing its limits - at least on the hardware I'm running it on.

I have been steadily growing a Java app (JEE5) using Apache OpenJPA to connect to the database.

When I first started, EJB3 and JPA were still in development.  EJB2 was too complicated for the benefit that I thought that I would get, so I 'faked it' and created my own pseudo-EJB app using: JDBC, servlets, servlets pretending to be EJBs, and POJOs.

The database performance was fairly bad because I have a severely interlinked data model.  So getting the data to display anything typically involves at least 8 tables, linked heavily and repeatedly.

Here is an attempt at describing something similar to my structure (a rough JPA sketch of a few of these mappings follows the list).  If you want a headache, try drawing out the relationship diagram.

  • Master entity (1 -> M) Component
  • Master entity (1 -> 1) Category
  • Master entity (1 -> 1) Sub-Category
  • Master entity (1 -> 1) Sub-Sub-Category
  • Master entity (1 -> 1) Creator (User)
  • Master entity (1 -> 1) Owner (User)

  • Category (1 -> M) Sub-Category

  • Sub-Category (1 -> M) Sub-Sub-Category

  • User (1 -> 1) Department

  • Component (1 -> 1) Component Class
  • Component (1 -> M) Component Attribute

  • Component Class (1 -> M) Class Attribute

  • Class Attribute (1 -> M) Attribute Value (list of possible)

  • Component Attribute (1 -> 1) Attribute
  • Component Attribute (1 -> 1) Value

  • Attribute (1 -> M) Attribute Value (list of possible)
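
To give a feel for what that looks like in code, here is a rough JPA sketch of just the top of the graph.  The names are made up, the referenced entities (Component, Category, and so on) are assumed to be mapped the same way, and whether a given (1 -> 1) link is really @OneToOne or @ManyToOne depends on the actual model:

import java.util.List;

import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.ManyToOne;
import javax.persistence.OneToMany;

@Entity
public class Master {

    @Id
    private Long id;

    // Master entity (1 -> M) Component
    // (assumes Component has a 'master' back-reference)
    @OneToMany(mappedBy = "master")
    private List<Component> components;

    // Master entity (1 -> 1) Category / Sub-Category
    @ManyToOne
    private Category category;

    @ManyToOne
    private SubCategory subCategory;

    // Master entity (1 -> 1) Creator and Owner (both Users)
    @ManyToOne
    private User creator;

    @ManyToOne
    private User owner;
}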

JPA rescued me.  Just changing my data access to use JPA increased the performance of my app about ten-fold.  And, it simplified my code - I was in love.

Later EJB3 session beans took the place of my fake EJBs (servlets).  And I was in love again.

But now, I have millions of rows in over a hundred tables.  Linked together in ways that I could probably not even write JDBC to access anymore.  And the data is stored in MySQL.

Don't get me wrong - I really do like MySQL.  It has been good to me.

But my database with data and indexes is approaching three gig.  As in three billion bytes - and MySQL is starting to have some trouble.

So I think I might need to stray.

Breaking up is hard to do though.

Thursday, July 9, 2009

Wouldn't you know...

Well, I'm trying to move forward with using/learning Groovy.

But, I need to use it with JPA and EJB3 session beans.

In case you were wondering...  There aren't any good ('realistic') examples of how to do this.  At least not that I was able to find on the web.

It may be, that there are books that demonstrate how to use them - I don't know yet.

So, instead of being able to jump in and run -

Baby steps.

Wednesday, July 8, 2009

'Scuse me while I whip this out...

I think that the biggest problem with learning about a new programming language or tool is:

While you are holding your new 'hammer',
everything looks like a nail.

And you need to learn what the real cases are where the new tool actually is the right one for the job.

Otherwise, you end up writing lots of bad code trying to shoehorn an inappropriate method into a place it does not belong.

For me, there is a second problem that I sometimes have trouble overcoming.  And that is when I already have a solution (possibly a very ugly one) that should have used this new method that I just learned.  In this case, it is scary to break something that works just to do it the 'right' way.

There have been some resounding successes for me recently.

Trading JDBC data access for JPA - Amazing improvement in code understandability and speed.

Trading 'thick' servlets that included large chunks of functionality that really belonged in EJBs for actual EJBs - Tremendous increase in code reuse and automatic transaction support inherited from the EJB container.

Not every experiment panned out though.  I used a message driven bean to perform a cleanup process asynchronously that really needed to stay synchronous (I still haven't cleaned all of the mess up from that).

So, now - as I try to learn Scala and Groovy (I can't seem to limit myself to one language at a time) it is hard to find when and how to apply them.

Oh well.

If it was easy...

Everyone would do it.

Tuesday, July 7, 2009

Focus...

That is something that I have a combination of far too much and far too little of.

When I am working on something - I can do so completely ignoring just about everything else (fatigue, hunger, distracting noises, etc).

But, there are way too many things that I want/need to learn.  Particularly with computer technology (and languages).

And, since any one would take at least a few months to figure out whether I really want to learn it (and the fact that there are only twenty-four hours in a day) - how could I pick just one at a time?

I should though.  And I know that I should.

Because instead of already having a decent start on Drupal - I have managed to taste:
  • Drupal
  • Ruby (very small taste)
  • Scala
  • Groovy
  • etc
But I cannot do anything useful in any of them.

So - I have to pick one.

And focus on it.

Everything else will have to just go on my todo list.

Where did I put that thing?

And which one comes first?

Scala and Groovy would probably be most useful for work...


Argh.

Monday, June 29, 2009

In case you were wondering...

Well, I managed to put together a todo list.

It is several pages long.  And, I have almost certainly missed half of the things that I need to do.

But, it is a start.  I still need to find some single place to keep/update it.

Until now, I have kept several concurrent todo lists.  One (at least) in my head and several on paper that I usually misplace before finishing everything on them.

One thing that I put on my list as a recurring task was updating this blog: once a week - on Tuesdays.

I missed out on last week - partly because I was trying to more completely list out the rest of my todo list and partly because I had a thousand other things to do (some on the list and many that slipped through the cracks).

So, I am going to try to cut myself a little slack and allow myself to make that Tuesday post anytime before the next Tuesday.  This is the first such 'late but not missed' post.

Hopefully, there will be more tomorrow (Tuesday).


But, I guess we'll see.

Thursday, June 18, 2009

The first step...

The first step to really getting organized is to figure out everything that you need to do.

The second step is to order the list in terms of what should be done first, second, etc.

Unfortunately, the first step to becoming totally overwhelmed with what you need to do is the same as the first step to getting organized.

A todo list is supposed to have an end, right?

It will be nice when I find it.

Tuesday, June 16, 2009

Drupal simmering...

I had hoped that I would be able to spend a good chunk of time learning how Drupal works and building something useful with it.

But, real life got in the way - so that has not happened (yet).

And, the reason for that is probably that on my todo list, everything has a priority of 'Should be done Yesterday!'

I think I will try to get my list to have a little more structure.


This is going to be interesting.

Thursday, April 30, 2009

Handing over the keys...

I'm generally not a fan of systems where the computer tries to figure out what you 'really want to do'.

But, I have gotten to a point in my life where I might be open to the idea that it is possible for a development environment to guess correctly.

And so, I am beginning to try out Drupal.

If things work out, then it will make my development of sites easier (I'll still hand code my web -apps- at least for the time being).

Taking a deep breath....

And off we go.

Doc, it hurts when I...

Normally, I would say that if something hurts - you should stop doing it.

But, I didn't know how badly I was hurting myself when I created multiple versions of all of my JPA entities depending on the depth of data that I wanted to have fetched.

So, fixing my app to use FetchGroups (an OpenJPA feature) to do what I had hacked together has been (and still is) a fairly painful process.

I first have to find each place that needs to have a custom fetch group defined and then add to the fetch group until it contains everything that I need.
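
Defining one looks roughly like this - a sketch, where the entity, field, and group names are all made up:

import java.util.List;

import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.OneToMany;

import org.apache.openjpa.persistence.FetchAttribute;
import org.apache.openjpa.persistence.FetchGroup;

@Entity
@FetchGroup(name = "withComponents", attributes = {
        @FetchAttribute(name = "components")
})
public class Master {

    @Id
    private Long id;

    private String description;

    // Lazy by default; pulled in only when the "withComponents"
    // fetch group is added to the active FetchPlan.
    @OneToMany
    private List<Component> components;
}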

Once I have it set up correctly though...It is a beautiful thing.

Fast, clean, and obvious what is going on.

Thursday, April 2, 2009

That was painful, but worth it...

Last week I found out about a feature that the OpenJPA folks put into their implementation of JPA. But, it is not part of the standard.

That feature is custom fetch plans - And, I think I can safely say that they are my newest favorite thing.

The reason that these are exciting for me personally is that I use JPA to pull data from my back end database that is then converted to XML using JAXB and sent to a browser for processing/display. If I were to transform a completely populated 'top-level' entity, then I would be creating an XML document that could be several MB.

Over the past year, JavaScript engines have gotten faster - and continue to do so. But, trying to make JavaScript parse and manipulate blocks of XML data that are that big is not a nice thing to do to the browser (or the user).

Until this past week, I thought that I would need to create tailored versions of my JPA entities in order to send back just the part of the XML that I actually needed. Then, I found out about (Cue choir of angels) dynamic fetch plans.

What dynamic fetch plans do is allow you to specify which fields and relations are eagerly fetched at the time that you execute the query. That may not sound particularly earth shattering - but give it time to sink in.

You are able to specify down to the individual database column level exactly what will be pulled from the database (and/or specify the fetch depth). When doing JAXB processing, this allows me (and you if you need it) to tailor the exact data that will be turned into XML.

One telling example of how this can clean up the XML sent to the browser is sending a list of top-level entities for a drop-down. The only data that really needs to be sent back is the entity key and description. Without custom fetch groups, JAXB would try to build the entire fully populated entity tree for each entity. The amount of data being sent back to the browser would be insane! My previous solution of creating an array of JPA entity beans in order to define every possible grouping of desired fields was also insane. I picked a middle ground: an overly populated graph that was small enough/big enough for most uses and a sparse graph that would be super quick but only useful in one or two cases.
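
Here is roughly what that drop-down case looks like with a dynamic fetch plan - a sketch that assumes a Master entity with a description field; treat it as the shape of the calls rather than a drop-in, since the exact FetchPlan defaults vary:

import java.util.List;

import javax.persistence.EntityManager;
import javax.persistence.Query;

import org.apache.openjpa.persistence.OpenJPAPersistence;
import org.apache.openjpa.persistence.OpenJPAQuery;

public class DropDownLoader {

    // Send back only what the drop-down needs: the key and the description.
    @SuppressWarnings("unchecked")
    public static List<Master> loadForDropDown(EntityManager em) {
        Query query = em.createQuery("select m from Master m order by m.description");

        OpenJPAQuery kq = OpenJPAPersistence.cast(query);
        kq.getFetchPlan().clearFetchGroups();                     // start from a minimal plan
        kq.getFetchPlan().addField(Master.class, "description");  // the identity field is always loaded

        return (List<Master>) kq.getResultList();
    }
}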

Now, I have been able to remove all of the extra Entity beans. I have simplified the definition of my entity relationships and removed the possibility of missing changes if I change the actual table structures. Plus, I am able to exactly specify what I want to send back in each situation (anywhere from fully populated entities to single fields).

I just wish I had known about them two years ago when I first started using OpenJPA. It would have saved me a lot of refactoring and testing (at a time when I really can't afford the extra work). But I am glad that I managed to 'stumble onto' them. I didn't really go looking for an EJB 3 entity manager - I just used the one that came with Geronimo. If I had shopped around, I might not have given this the weight it deserves.

And it is -huge-. Not only can you pare down an overly generous eager fetch - you can also expand an excessively lazy fetch - at runtime with very little programming cost.

By the way, the painful part is trying to undo two years' worth of hacks to accomplish something that was already included in OpenJPA - in five days.

I think I used to sleep - didn't I?

Wednesday, March 11, 2009

Can we talk...

I get to spend a lot of my time (most of my time actually) working on web apps.

Nowadays I use Geronimo for my server, Firefox for my browser, and Dojo for the two to talk to each other.

Last time (actually, the time before that) I showed how I parse the XML that I am sending from the browser to the server. But, I did not show how that XML gets created.

Well, since I like to make things simple for myself, I wrote a couple of wrapper functions to abstract the Dojo functions so that I would not have to rewrite large blocks of code if the syntax changes in Dojo (which it has done since I started). Also, having a single way to send messages to my server means that I can have a consistent way of parsing those messages.

I broke the sending of messages into two parts:
  1. Building the message
  2. Sending the message and specifying the callback
And go figure, there is a JavaScript function (that I wrote) to do each of those things.

For building the message, I created a function called 'queueCommand'. The idea was that you could create several message queues that all needed to be sent to the same servlet (and whose output would be consumed by the same callback function). I actually set up a number of functions and made it possible to build several message queues - but for simplicity, we'll pretend there is only one at a time.

Here is queueCommand:
function queueCommand(command) {
    var commandXML = encapsulateCommand(command);

    if (window.commandQueue === undefined) {
        window.commandQueue = [];
    }

    window.commandQueue[window.commandQueue.length] = commandXML;
}

Pretty simple right? Oops! You caught me. There is a third function that I neglected to mention. The encapsulateCommand function tries to remove characters that would cause the generated XML to be invalid as it puts together a simple (and standardized for my purposes) XML snippet.

Here is encapsulateCommand (with a helper function):
function encapsulateCommand(command) {
    var nodes = "";

    for (var i in command) {
        nodes += "<" + i + ">" + escapeString(command[i]) + "</" + i + ">";
    }

    var commandXML = "<command>" + nodes + "</command>";

    return commandXML;
}

function escapeString(inputData) {
    var outputData = inputData;

    if (typeof inputData == 'string') {
        if (inputData != null && inputData != "") {
            // Note: replaceAll here is a helper, not a method that 2009-era
            // browsers provide (plain String.replace only swaps the first match).
            outputData = inputData.replaceAll("&", "&amp;");
            outputData = outputData.replaceAll("<", "&lt;");
            outputData = outputData.replaceAll(">", "&gt;");
            outputData = outputData.replaceAll("\"", "&quot;");
            outputData = outputData.replaceAll("\\", "~1~");
            outputData = outputData.replaceAll("%", "~2~");
            outputData = outputData.replaceAll("\'", "~3~");
            outputData = outputData.replaceAll("\n", "~4~");
        }
    }

    return outputData;
}

And here is an example of how you would call it:
var pushCommand = queueCommand({
    action: "doSomething",
    fieldValue: dojo.byId('fieldID').value
});

I don't actually send back a return value. But, I do capture the result in a variable because if something goes wrong - receiving the result into a variable prevents the error from stopping program execution. If I were being more diligent, I would send back an actual result status (or maybe even the assembled command) - But I didn't do that.

So, when the above command (var pushCommand = ...) is executed, the following XML snippet gets added to the queue (shown here 'pretty'):

<command>
<action>doSomething</action>
<fieldValue>SomeValue</fieldValue>
</command>

The pushQueue function bundles up all of the commands that have been placed into the queue inside of a proper XML header and an 'envelope'.

Here is pushQueue:
function pushQueue(url, onLoad) {
    var queue = window.commandQueue;

    if (queue == undefined) {
        window.commandQueue = [];
        queue = window.commandQueue;
    }

    var xmlDoc = "";

    xmlDoc += "<?xml version='1.0' encoding='utf-8' ?>";
    xmlDoc += "<commands>";

    for (var i = 0; i < queue.length; i++) {
        xmlDoc += queue[i];
    }

    xmlDoc += "</commands>";

    window.commandQueue = [];

    // Parse the assembled string so the browser posts a real XML document.
    var parser = new DOMParser();

    var xmlDocument = parser.parseFromString(xmlDoc, "text/xml");

    var ajax = dojo.rawXhrPost({
        url: url,
        postData: xmlDocument,
        load: onLoad,
        headers: {
            "Content-Type": "application/xml"
        },
        handleAs: "xml"
    });
}

You would call pushQueue with the URL of the servlet (or whatever resource is going to handle the message) and the callback function that should get the result.

Here is an example of calling pushQueue:
var sendTheMessage = pushQueue("/Handler", callBackFunction);

After that call, Dojo would send the following XML document to '/Handler':

<?xml version='1.0' encoding='utf-8' ?>
<commands>
<command>
<action>doSomething</action>
<fieldValue>SomeValue</fieldValue>
</command>
</commands>

Now, just in case you were wondering, there is nothing special about the 'command' and 'commands' tags that I used. They are really entirely arbitrary, but they make sense for what I am sending (a list of commands, each of which is a command).

And, when the result is sent back from 'Handler', it will send the result to my JavaScript function called 'callBackFunction'.

So, with a couple of functions placed in a JavaScript file that is included on all of my pages, I am able to have a standard way of sending messages to my server. And because the format of the messages is the same every time, I can use the same XML parsing code (shown on a previous post) to extract the information.

Days go by...

It doesn't seem like it has been over a month since the last time I posted.

But, the fact is that it has been (ack!).

So, back to the grindstone.

Friday, January 30, 2009

Goodbye to the old...

When I first started using Geronimo (the Apache JEE server), it was on version 'point something'. And, I needed to parse XML messages being sent from a browser to servlets.

So, I used JDOM. It was fairly simple to use and the JDOM libraries were included with Geronimo.

That made things very simple. All I needed to do to parse the XML messages was put something like this in the doPost method:

Reader reader = request.getReader();

try {
    SAXBuilder builder = new SAXBuilder(false);
    Document doc = builder.build(reader);
    Element root = doc.getRootElement();

    // ... pull values out of 'root' with the wrapper functions below ...
} catch (Exception e) {
    e.printStackTrace();
}


Then, to get values out of the document, I wrote a number of wrapper functions like getString, getLong, getX - to convert the values in the XML document into something useful (something other than strings).
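
As an example, the getLong wrapper was along these lines - a sketch only, since the real versions had more error handling:

import org.jdom.Element;

public class XmlValues {

    // Pull a child element's text out of a JDOM element and convert it to a Long.
    public static Long getLong(Element parent, String childName) {
        String text = parent.getChildTextTrim(childName);

        if (text == null || text.length() == 0) {
            return null;
        }

        return Long.valueOf(text);
    }
}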

But, when Geronimo went from version 1.x to 2.x, they dropped the JDOM library from the assembly - And I started to have to include it myself.

Recently, I finally got tired of having to add the library myself and started to look for alternatives to using JDOM.

Enter W3C...

So, now I get to change a bunch of servlets from the code above to this new version:

Document document = getXML(request, dbf);

Ok, so that wasn't quite all that I did. That 'getXML' function isn't included with the W3C's DOM classes. I had to write it myself. And, here it is:

protected Document getXML(HttpServletRequest request, DocumentBuilderFactory dbf) {
    Document document = null;

    try {
        DocumentBuilder db = dbf.newDocumentBuilder();

        BufferedReader in = request.getReader();

        String input = "";
        String xmlText = "";

        while ((input = in.readLine()) != null) {
            xmlText = xmlText + input;
        }

        System.out.println("Received: " + xmlText);

        InputSource source = new InputSource();
        source.setCharacterStream(new StringReader(xmlText));

        document = db.parse(source);
    } catch (Exception e) {
        System.out.println("Exception: " + e.getMessage());

        e.printStackTrace();
    }

    return document;
}

Now, that is much bigger (as far as lines of code) than the old way. But, by putting it into a function - you can't tell. And, once I am done with all of the change over, I will be able to get rid of that extra JDOM dependency.

Plus, along with finding how to change over to use the W3C's DOM, I also found the XPath libraries. So, I was able to change all of my XML to use real, properly formed documents. What I had been doing before was storing all of the real information in attributes of a single element.
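
For what it's worth, pulling a value out with XPath looks something like this - a sketch, where the expression assumes messages shaped like the <commands>/<command> documents I send from the browser:

import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathExpressionException;
import javax.xml.xpath.XPathFactory;

import org.w3c.dom.Document;

public class CommandReader {

    // Evaluate an XPath expression against the parsed document and
    // return the matching text (the 'action' of the first command).
    public static String readAction(Document document) throws XPathExpressionException {
        XPath xpath = XPathFactory.newInstance().newXPath();

        return xpath.evaluate("/commands/command/action", document);
    }
}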

Bonus!

Take a deep breath...

Well - I have been slowly trying to get myself accustomed to the idea of social networking over the internet.

This blog was my first tentative step.

Then, I let myself get a twitter account - I don't plan on posting anything just yet. That may change.

But, I finally got around to reading other people's blogs.

And I realized something.

I have a long way to go to get my blog to be what I want it to be.

So hopefully this is a changing point. I hope that from now on, this will become a blog that someone else would actually follow.

Wish me luck.

Wednesday, January 28, 2009

Tools holding you back...

I have gotten too used to working with tools and skipped over learning how to build Java artifacts by hand.
And now, I am starting from scratch on my new project.

Which isn't such a big deal. Using Eclipse, I started creating my WAR file - no problem.

But, I want to use Maven to manage the project parts. Including creating the database pools.

So, I finally -have- to learn how to use Maven myself.

Yay!

(No, really - Yay. Seriously).

Monday, January 26, 2009

Wait, I think I saw it move...

There actually has been (a very small) step forward.

I have one web service and one entity written.

(One down and how many to go?)

Now I have to put it into a maven project and start setting up unit testing.

And then write the rest of the General Ledger.

Piece of cake.

(Maybe pie - pie is a little trickier to make than cake).

Sunday, January 11, 2009

Nothing like starting small...

I had nearly forgotten what it was like to start from scratch.

Actually, I think that I did forget what it was like to start from scratch.

I have been programming professionally for twenty years now. And except for toy projects - only about three years of that time was spent working on projects that started in my head. Everything else has been upgrades and changes to significant code bases started by someone else.

And even the current focus of my professional work (which started in my own head) is now two years past its start. Even it feels like working on an established code base. Many of the design and implementation decisions that were made were just the farthest reach of what I knew how to do at the time.

Now, I know a lot more than I did then - And that makes things more complicated when deciding which approaches I should use for AccSys.

Well, the going is slower than I thought it would be -

But it is going!

Finally (geeze).

It's time to start...

Well, I haven't really had a chance to do the kind of design work that I hoped to do on my G/L - But, time keeps slipping by.

So, I'm going to go ahead and start work (using the semi-congealed design that is floating around in my head). And, as I have time, I'll continue working on the formal design.

(Note to self: I should really use this new beginning as an opportunity to do test-driven coding - rather than trying to do all of my testing by hand after the work is finished).