Saturday, June 30, 2007

putting my money where my mouth is, is really messy


P1010341
Originally uploaded by cubertsc
Our heat and air system was installed in 1989. Being nearly 20 years old, it's not working very well, and our ducting is so full of holes that we're losing between 20% and 30% of our conditioned air into the attic. To support our eco-friendly goals we're installing a geothermal exchange heat pump. Geothermal is one of the most environmentally friendly ways to heat and cool a house, since the only electricity it uses goes to run a water pump and a blower, but getting there is really messy.

We are installing two WaterFurnace ES030S1ANB units, coupled with Trane 4TEE3F31B1000A variable speed air handlers. We didn't choose these particular components ourselves; we chose the most reputable company in our area and this is what they recommended.

In our area (close to the water, sandy soil, with our average daily temperature), the general rule is one 250-foot-deep well per ton of heating/cooling capacity, and two tons of capacity per 1000 square feet of living space. Our house is 2800 square feet, so we ended up with six wells (the arithmetic is below). These are being connected in a zoned system, with three wells servicing each of the WaterFurnace units. This is a closed loop system, meaning that there is no outflow of water. You can think of it like your refrigerator or the cooling system in your car: instead of using air to cool things off, this uses the ground.
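To make the sizing arithmetic concrete (these are our local rules of thumb; your installer will have different numbers for your area):

2800 sq ft x 2 tons per 1000 sq ft = 5.6 tons, rounded up to 6 tons
6 tons x 1 well per ton = 6 wells, each 250 feet deep (1500 feet of drilling in total)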

Our lot is only 67 feet wide, and in total the wells cover a space of about 10 x 20 feet. The heating/cooling field will extend up to 20 feet out from that. The loops are joined together about 3 feet underground, then connected underground to the heat pump. When it's all done we will have two relatively small sealed boxes outside that house the water pumps and heat exchangers, replacing the usual air conditioning compressor eyesore. Inside we will have two air handlers in the attic, suspended from the rafters so they don't vibrate the ceiling and cause noise. We will have no gas or electric powered supplemental system; all our heating and cooling will be done using only the electricity needed to power the water pumps and air handlers.

As for cost, generally speaking geothermal runs about 20% more than a conventional system, and you can expect a reduction in your utilities of 20% - 50%, depending on your usage. Our situation is a little different: we're also having new ducting and new inside air intakes installed. That adds about $8,000 to the cost, and it needed to be done regardless of the system we chose. Altogether we spent about $26,000 on the WaterFurnace units, the air handlers, the ducting and all the other interior work. The wells were another $11,000.

By way of comparison, the company that installed and serviced our conventional heat pump quoted a little over $20,000 just to replace the conventional furnaces and compressors. They didn't see any problem with the ducting or air intakes. The main trunkline of the ducting was original to the house (1962ish) and in pretty bad shape, and the air intakes weren't even connected to the air handlers, so they were sucking in air from the attic instead of from the house.

So at the end of the day we could have gone with a conventional system for a little over $20K, added the $8K for ducting and the other work, and been done with it for about $28K. Our total for the geothermal system, including all this extra work, will be closer to $40K, and we expect to see about a $120/mo drop in our utilities.

It's a long ROI (27 years: that's roughly the full $40K measured against the $120/mo savings), but the upside is that there is one. With a conventional system the savings would have been so slight we never would have fully recouped the investment. The new system requires extremely little maintenance, it's being done right by reputable people I trust, and it will theoretically last forever. The guy we bought the system from said his company has been installing these systems in our area for over 30 years, and the only parts he has ever had to replace are a handful of water pumps. And I like to think we're earning some karma points by reducing our environmental impact and burning fewer fossil fuels. :-)

Side note: one additional point for us was that this only needs electricity to run a pump and a blower. At some point down the line I am considering adding a solar panel to handle that, so in the event of a hurricane we will still have heat and air even if the power is out for an extended period. Even without the solar panel, we were told we can run the entire system off a car battery for a day or so.

Updated 7/7/07 - Added details and fixed some parts that I got clarification on after talking to the guy doing the install.

Wednesday, June 27, 2007

Message recall: a comparison of Notes 8 to Outlook 2003

The most recent feature tempest in a teacup is around message recall in the upcoming Notes and Domino 8 release. Ed and Mary Beth, brave souls that they are, solicited feedback from the community. Through the ensuing discussion many people suggested that Notes 8 mimic Outlook 2003 as its default, just for the sake of user consistency.

That's honestly not possible because of inherent differences in how Notes and Outlook function. First, Outlook message recall is processed on the client rather than the server. Second, the amount of useful control administrators have over Outlook is a number approaching 0. Let's examine why these are relevant to message recall.

Message recall notices in Outlook are processed on the client, which means they are received into the message delivery folder. This poses two problems. First, if someone is on vacation or out for some other reason, the request won't be processed until they log into Outlook again. Second, and more importantly, Outlook users can configure their mail to be delivered to a personal folder, and if they do, message recall will never work: the person sending the recall request gets a failure notice and the original recipient never even knows the recall request happened. Finally, tying back to the second key difference above, as an administrator you can't force the mail delivery folder to be something you can control.

The lack of administrator control over Outlook is a huge problem. You have to use the Office Administration Kit to deploy custom installations, then use Group Policy Objects to further refine settings. Many options aren't available to be set at all, and there is no way to prevent a user from changing something that was set via policy.

As it relates to message recall, an Outlook user can choose not to process recall requests automatically. If a user does this (and most do), there is no way for a recalled message to be automatically deleted -- even if the recipient has not read it yet. The recipient gets a request in his inbox and can choose to honor it or not, at his discretion. Administrators cannot prevent the user from making this change, either.

By contrast, Domino processes the mail recall on the server. There is no requirement that users log into their mail to make anything happen. Furthermore, you can configure Domino so that it will always delete the message, even if it has been read. Domino offers robust and granular server-driven control over what a user can do with the Notes client, so you can not only set the property for the user through a policy, but also prevent the user from changing it.

Nathan's writeup about how message recall "should" work is very good, and it sounds like something that would be worth implementing in the OpenNTF Mail Experience template. For example: click recall, and it presents a list of only the people in your current Notes domain (a rough sketch of that piece is below). I'm not sure how much further it can be taken without adding some server-side pieces to store metadata (does this user allow recall, etc.). If there is enough interest and someone is willing to help me spec out all the possible pitfalls, I'm willing to code it. Any takers?
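To make that concrete, here's a quick LotusScript sketch of just the domain-filtering piece. The function name is my own invention, and I'm assuming the local domain name can be read from the Domain= line in notes.ini; treat it as a starting point, not a spec.

Function GetLocalRecipients(memo As NotesDocument) As Variant
	' Return only the recipients of a sent memo who are in our own
	' Notes domain -- the candidates a recall dialog would display.
	' (A real version would also look at CopyTo and BlindCopyTo.)
	Dim session As New NotesSession
	Dim domain As String
	Dim recipients As Variant
	Dim result() As String
	Dim i As Integer, n As Integer
	
	' Assumption: the local domain name is in notes.ini (Domain=...)
	domain = session.GetEnvironmentString("Domain", True)
	recipients = memo.GetItemValue("SendTo")
	Redim result(Ubound(recipients))
	
	n = 0
	For i = 0 To Ubound(recipients)
		' A bare hierarchical name (no @) is local; so is anything
		' explicitly addressed to @<our domain>
		If Instr(recipients(i), "@") = 0 Then
			result(n) = recipients(i)
			n = n + 1
		Elseif Ucase$(Strright(recipients(i), "@")) = Ucase$(domain) Then
			result(n) = recipients(i)
			n = n + 1
		End If
	Next
	
	If n > 0 Then
		Redim Preserve result(n - 1)
		GetLocalRecipients = result
	Else
		GetLocalRecipients = Null
	End If
End Function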

Monday, June 25, 2007

how not to structure a database


This is what I've been working on for the past three weeks: a circular relationship map for the database I'm supporting, reverse engineered using Embarcadero ER/Studio. No indexes other than the primary keys are defined, and there are no foreign keys in any of the tables, so the relationships have to be inferred from the stored procedures. This explains why a 5GB database with 120 users needs Windows Server 2003 x64, two dual-core CPUs and 8GB of RAM.

Oh, and those tiny little blips in the upper left corner are views with no columns defined. I haven't figured out why they're in the database yet.

Sunday, June 24, 2007

Park Seed Flower Festival 2007


P6230286
Originally uploaded by cubertsc
Take 9 acres of test beds, plant them with over 2000 varieties of plants, and what do you get? About 10,000 visitors! Park Seed is located in Greenwood, SC, about three hours north of Charleston. They are the largest catalog seller of seeds and plants in the US. Every year they sponsor a festival to showcase what's new for the fall and the coming year. It's a lot of fun if you're into plants like we are. It was a gorgeous day, and even though the temperature reached nearly 90 we had a blast. I'll be adding descriptions to my Flickr pictures to explain what some of the plants were.

We purchased a bunch of plants that are going into a new water feature area that's between our bedroom and the front door of the house. Progress pictures to come shortly. :-)

Wednesday, June 20, 2007

Imagining the Tenth Dimension

The people in the #visualbasic channel on IRC scare me sometimes. They shared an animation about string theory and the 10th dimension. The general gist: every possible timeline for every possible reality can be expressed as a single dimensionless point. It's way more complicated than it sounds; check it out for yourself.

Video: Imagining the Tenth Dimension

Tuesday, June 19, 2007

SNTT - Resolving a relative path

I recently needed to resolve a relative path, such as C:\Windows\..\autoexec.bat. Dir$ will tell you whether the path is valid, but what if you want to display the resolved path without the relative references? The secret (at least on Windows and in LotusScript) is the mostly undocumented shlwapi.dll library. Here are a few lines of code that will do it for you:
(Declarations)
' PathCombineA combines two paths and canonicalizes the result,
' collapsing any "." and ".." references along the way
Declare Function PathCombine Lib "shlwapi.dll" Alias "PathCombineA" _
	(Byval szDest As String, Byval lpszDir As String, Byval lpszFile As String) As Long

Sub Click(Source As Button)
	Dim sFullPath As String
	Dim sResolvedPath As String
	
	sFullPath = "C:\Windows\System32\drivers\..\..\..\autoexec.bat"
	Msgbox sFullPath
	
	' Allocate a buffer for the API to write into (MAX_PATH is 260)
	sResolvedPath = Space$(260)
	' An empty file argument makes PathCombine simply canonicalize the path
	Call PathCombine(sResolvedPath, sFullPath, "")
	' The API null-terminates the result, so trim the buffer at the null
	sResolvedPath = Left$(sResolvedPath, Instr(1, sResolvedPath, Chr$(0)) - 1)
	
	Msgbox sResolvedPath
End Sub
This LotusScript was converted to HTML using the ls2html routine,
provided by Julian Robichaux at nsftools.com.

Thursday, June 14, 2007

a month in my new job

Today marks my 30-day anniversary of changing jobs. I've been dismayed and overwhelmed by the amount of work to be done, and frustrated by the sometimes unrealistic expectations thrown at me. I just keep plugging away and doing what I can.

As far as the Microsoft environment itself... well I honestly think I'm in a poorly implemented one, and that's why it's been so frustrating. The guy I'm replacing was a brilliant security admin, a fair DBA, and a really really bad developer. So the security is draconian, the databases are poorly implemented, and the code makes me want to cry.

I've still not acquired a taste for Outlook (2003). I find it confusing and hard to find what I'm looking for; the inconsistent toolbars drive me the craziest. Trying to track down a full message thread has proven impossible. The interface is only customizable in predefined ways, so there's no going into your mail template and adding a button or changing the order of the left navigator. The calendaring is rudimentary at best, and resource scheduling is just about pointless since there's nothing on the server that checks resource availability.

On the app dev front I am still a fan of Visual Studio, but I'm bothered that deploying new versions of an application is an arduous affair involving a custom-written application update utility. I really got used to just updating a single file on a server and calling it done. Security is complicated and pretty much all-or-nothing; granular per-user rights just aren't in any Microsoft developer's vocabulary because they're such a huge pain to implement. Adjusting my way of thinking to that paradigm is proving incredibly difficult.

In a nutshell... I miss Notes. A lot. But I don't miss Domino Designer, even a little. Notes 8 is pretty, but I'm really waiting for Notes 9 with the updated Domino Designer. Maybe I can plug the holes with LCD in the meantime... In any case, for all the glitter on the surface this MS stuff is pretty rotten underneath.

Monday, June 11, 2007

help me choose a new digital camera

Way back in the stone ages of digital photography we bought an Olympus C-755 UZ. The main reasons we settled on this model were the 10x optical zoom and the super macro mode, since we do lots of wildlife and plant photography. The biggest downsides are the tiny LCD on the back, the lack of image stabilization and the 4.1 MP resolution. Oh, and after several years of use (and abuse) the flash doesn't work consistently.

It's time to get a new one, and I've decided to go with a higher end camera. I'm looking for at least 6 MP, optical image stabilization and a zoom in the 300mm (8x - 10x) range. The option to add lenses would be nice, too. I don't want digital image stabilization, and I don't want a Sony camera with its wacko nonstandard media.

So far I'm most seriously considering the following:

Nikon D40


Pros:
  • Digital SLR
  • Very highly rated
  • Wide array of lenses
Cons:
  • Potentially lots of lens changes or stacking/removing lenses
  • No autofocus motor or image stabilization built into the body, so I'll have to buy lenses with those features, and they're quite pricey

Pentax K100D


Pros:
  • Digital SLR
  • Image stabilization and an autofocus motor built into the camera body
  • Slightly less expensive than the Nikon D40
Cons:
  • Potentially lots of lens changes or stacking/removing lenses
  • Comparatively fewer high quality lenses available

Canon Powershot S5 IS


Pros:
  • Point and shoot = less complex
  • 12x zoom = 432mm equivalent focal length out of the box
  • Canon's image stabilization is legendary
Cons:
  • Not a digital SLR
  • Costs as much as the other two so I just feel like I'm getting ripped off somehow
All three have the same basic form factor, so it's not like the S5 is any smaller. It boils down to how I see myself using the camera in the future: am I more of a point and shoot kind of guy, or do I take my time to do composition and fully utilize the manual modes that are available? I've never had a fully manual camera, so I'm honestly not sure. At this point I think it's something I would like to at least have as an option, and both the D40 and K100D seem to be very capable in fully automatic mode while offering the option to step out of the box a bit more. I just wonder about the price I'll pay for flexibility that I may never use.

Does anyone out there have any suggestions?

Sunday, June 10, 2007

Greetings from the Eastern Shore of Virginia


P6020197
Originally uploaded by cubertsc
Last weekend Myron and I went to the wedding of our friends Emmy and Jamin on the Eastern Shore of Virginia. Here are the pictures I took.

Thursday, June 07, 2007

blog stats

I've been thinking about posting this for a while, and Ed's post jogged my memory. I'm about to go insane from dissecting stored procedures and needed a break, so here's info from the past 30 days for this blog.

First up, Google Analytics browser stats. As a percentage Firefox has been stable, IE has been decreasing, and Opera and Safari have been increasing.


Next, Feedburner subscription stats. These have fluctuated wildly: at one point Bloglines (which I've never even seen) accounted for over 80%, at another point Madicon (which I've also never seen) was over 50%. Netvibes has been above 60% for the past few months.

Tuesday, June 05, 2007

mimicking Readers and Authors in SQL Server 2005

One of Notes and Domino's greatest strengths is the incredible security model. I've also come to appreciate how powerful it is to have that security integrated with the application rather than having to rely on the Domino Directory. Another boon is being able to define readers and authors at the document level. Imagine, if you will, that Notes did not have Readers or Authors fields. Your only option for limiting access to documents would be at the view level, relying on UI elements (computed framesets, computed embedded views, etc.) to control which view gets seen by each group of users. Ugly, right?
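For anyone who hasn't worked with Readers fields, here's roughly all it takes in LotusScript to lock a document down to a specific list of people. The field name and the names themselves are made up for illustration, and db is assumed to be a NotesDatabase you already have a handle on:

Dim doc As New NotesDocument(db)
Dim readers As NotesItem
' A Readers item holds the list of who can see this document.
' Anyone not in the list won't even see it in a view, no matter
' how the views and forms are designed (ACL access still applies).
Set readers = doc.ReplaceItemValue("DocReaders", "CN=Jane Doe/O=Acme")
readers.IsReaders = True
Call readers.AppendToTextList("ProjectTeam")
Call doc.Save(True, False)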

SQL Server 2005 does not natively implement row- or cell-level security, so that's exactly where you are in the MS development world. Security is only granular down to the table level; beyond that you have to start using UI techniques to filter data and control access. And you're just praying that nobody realizes they can use Excel to connect to your database and bypass all your front end business logic, or you have to build in an interface layer that is aware of how the database is being accessed. All in all, it's pretty ugly to try to support the business use case of letting people edit only certain records in a table.

It's not quite as bleak as I make it sound. Microsoft has published a set of instructions for hacking row-level security into SQL Server 2005. Basically it's roll-your-own authorization: you tag each row with who is allowed to see it, then force all access through views or procedures that filter on the current user. While it's not horribly convoluted, it is tedious, and it has to be reimplemented for every database where you want to use it. Oh, and there is no UI provided. You get to build that, and come up with a way to secure it, too. :-)

Monday, June 04, 2007

some observations

I've spent the past few weeks getting acquainted with ASP.Net 2.0 and Visual Basic 2005, via Visual Studio .Net 2005. In the course of this some of my previously held beliefs have been relinquished and others upheld. Let's start with the good.

MS Development Tools rock!


Holy crap, this is awesome! So much stuff is just done for you, and the Intellisense is simply amazing. The entire .Net framework is Intellisense enabled, and it picks up all your custom classes too. It can also dynamically refactor: change MyVariable to MyNewVariable and it updates every reference across your project. And the IDE just feels comfortable. I can find things relatively easily, unlike my experience in Eclipse, where I was constantly searching for stuff.

ASP.Net 2.0 is very slick


I've only done a few basic things with ASP.Net, but I've already accomplished more than I ever did with a Domino web app. I drag and drop like I would in a normal IDE. I double-click to generate event handlers. I don't have to go searching for the various places to stuff code, or know about arcane $$-prefixed design elements.

And now for the bad...

Cross platform?


Let's just be clear up front: you aren't going to easily create cross-platform applications using Visual Studio. That may be changing as Microsoft rolls out IronRuby and other dynamic language extensions, but it's not here yet. For now the fact remains that Microsoft steers you very strongly toward an all-Microsoft solution.

Stepping outside the box is hard


Either you take what you're given or you pretty much have to rewrite it all from scratch. Case in point: the nifty FormView component, which feeds SQL data into an HTML form, includes a set of default templates for reading, updating, and adding records. If you don't like the way a default template looks, it is extremely difficult to change; you're better off recreating it from scratch.

The data access components provided via UI widgets are moderately functional, but you will quickly run into their limitations; it's better to code your own data access from the ground up. This, coupled with the previous limitation, means you'll be writing a lot more code than you may have anticipated. At least you'll be doing it in a very nice IDE.

Lessons learned (so far)


Trying to create an enterprise application that is entirely web-based just plain sucks. The stateless environment of the Internet is not well-suited to transactional data entry. Of course it can be done, but it takes several orders of magnitude more of everything than a rich client application: more code, more testing, and more server resources. And after all that, it's still less functional than the comparable rich client solution.

For all the good stuff in Visual Studio, there are still some ugly bits and tremendous gotchas. If you rely on the IDE to do everything for you, you're going to end up with some marginal applications. This is one area where I think Domino Designer's lack of user friendliness is actually a good thing: you have to know what you're doing or you aren't even going to come close to a working solution. With VS.Net you can fake most of it by clicking and dragging. That's not to say I wouldn't like to see Domino Designer radically modernized, but there is something to be said for the "you know it or you don't" way it is now.

So... which one is better? That's just too close to call for now. I'm not a fan of web-based enterprise applications, but I do think that if I were headed down that path, the ASP.Net route would win.