The True Cost of ATG's Core Based Licensing

Modern CPU cores tripled licensing costs while performance gains lag. ATG's core-based model is economically broken.

This is a follow-up to a post I made 8 months ago: Why ATG’s Core Based Licensing is Stupid
With the new Westmere hex-core CPUs out now, the problem has gotten worse. A mid-high or high end Westmere CPU presents as 12 cores (6 physical cores, each with Hyper-Threading). So what does this really mean?

I just ran the numbers: a mid-high end single-CPU server in 2008 (Xeon 5450) would cost me 4 ATG cores' worth of licensing, and would handle X amount of traffic.

A mid-high end single-CPU server in 2010 (Westmere Xeon 5650) would cost me 12 ATG cores' worth of licensing, and will only handle 35 to 70% more traffic than X (based on published SPECint, SPECint_rate, and SPECfp scores for the two CPUs).

So licensing costs triple (12 cores vs. 4) to handle just 35 to 70% more traffic. Or simply to provision with modern hardware at all. That's crazy.
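To put numbers on it, here is a quick sketch using the core counts and SPEC-based throughput gains above, and assuming (as ATG's model implies) that license cost scales linearly with licensed core count:

```python
# Rough cost-per-traffic comparison, 2008 Xeon 5450 vs. 2010 Westmere 5650.
# Assumption: licensing cost is proportional to ATG-licensed core count.
old_cores, new_cores = 4, 12                       # licensed cores per server
gain_low, gain_high = 0.35, 0.70                   # relative throughput gain (SPEC-based)

license_cost_ratio = new_cores / old_cores         # 3.0x the licensing cost

# License cost per unit of traffic served, relative to the 2008 baseline.
best_case = license_cost_ratio / (1 + gain_high)   # biggest throughput gain
worst_case = license_cost_ratio / (1 + gain_low)   # smallest throughput gain

print(f"Licensing cost: {license_cost_ratio:.1f}x")
print(f"Cost per unit of traffic: {best_case:.2f}x to {worst_case:.2f}x")
# ~1.76x to ~2.22x the per-traffic licensing cost of the 2008 server
```

In other words, even in the best case you pay roughly 76% more in licensing per unit of traffic just by moving to current hardware.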
