Posted 3140 days ago
I like Jeff Atwood’s blog Coding Horror, and while he mainly centers on developer-type stuff, he does write some very interesting and thought-provoking posts. His post on the 20th was one of these, and I think it should be required reading for every single Web 2.0 developer out there.
In one simple paragraph he sums up one of the major failings of web development (well, actually, of all development):
And please, if you’re designing social software, try to avoid repeating the many mistakes of our forefathers. Again. Design from day one with the assumption that a few of your users will be evil. If you don’t, like Six Apart, your naïveté will make the entire community suffer sooner or later.
The interesting reading continued as I breezed through the comments posted to his very understandable look at the problem of trackbacks and how they are broken. The one thing I came away with from reading them is that, on a certain level, developers live in a rarefied world of purity.
On one hand, they can acknowledge that bad things can be done with software, whether web-based social software or thick-client applications; on the other hand, they confine the problem to a small segment of users. That may be true, but what is seemingly being missed is that even if this “evil user” segment is small (which is questionable), its effects are magnified a hundredfold by the very technology being used.
To develop anything, regardless of the technology, without taking into account the “evil quotient” that is part of our society only proves how much you live in a dream world; thinking that your software would be of no interest to all those evildoers out there will come back to bite you on the ass.
Anything created by man can just as easily be corrupted by man and software is no different – just easier.
Posted 3190 days ago
I don’t know about you, but the fact that parts, or all, of the software used in the defense of one’s country is being written by offshore software companies with no allegiance to the contracting country, other than to the almighty dollar, bothers the hell out of me. In an article on BusinessWeek.com, the two authors, Steve Hamm and Dawn Kopecki, report on a Defense Science Board task force that is looking into this very thing and preparing a set of recommendations on how to deal with the trend.
Part of the article also mentions a possible correlation between this offshored software and a rise in “suspicious attempts” to hack U.S. computer systems.
National security issues concerning the offshore development of software have been raised in the past. In 2001, foreign software companies and programs developed overseas were flagged as a bigger potential threat than domestic hackers when the Defense Security Service noticed a surge in “suspicious attempts” by foreign hackers to gain access to U.S. computer systems. Approximately one-third of those attacks were sponsored in some capacity by foreign governments, according to the Government Accountability Office. “We recognize that there are real threats,” says Phil Bond, chief executive of the Information Technology Assn. of America (ITAA), a tech lobbying group. “We want government to deal with this in a smart way, and we’re concerned they might do it wrong.” [Full Article]
Part of the Pentagon’s complaint is that forcing it to use homegrown custom applications would correspondingly increase the cost of the software. So the accepted attitude is that it is better to farm out chunks of code and save taxpayer money.
So … it is better that the military use software that could potentially contain “bombs” and in the process watch the number of unemployed American techs grow, all the while corporate pockets bulge from paying a little less than they would have if they had utilized the very able and willing American workforce.
Nothing like having your military available for the lowest bid.