I have my doubts about many things and the arguments here and in the comments section loom large.
Yes, I can see that business sees no need for an 'arms race' escalation of desktops once the basics are there. A few people (gamers, developers) might want personal workstations that they can load up with memory and high-performance graphics engines, but for the rest of us, it's ho-hum. As for Intel and AMD producing chips with more cores, more cache, integrated graphics and more: well, Moore's Law applies to transistor density, doesn't it, and they have to do something to soak up all those extra transistors on the chips.
As for smaller packaging, what do these people think smart phones and tablets and watches are?
Gimme a break!
My phone has more computing power than was used by the Manhattan project to develop the first nuclear bomb.
These are interesting, but the real application of chip density is going to have to be something other than serving the desktop.
And for #1 and #3, Windows will become, if not an impediment, then irrelevant.
It's possible a very stripped-down Linux can serve for #1 and #3, but somewhere along the line I suspect people might wake up and adopt a proper RTOS such as QNX, much in the same way that Linux has come to dominate #2. It is, however, possible that Microsoft will, now that Gates and Ballmer are out of the scene, adopt something Linux-like or
work with Linux so as to stay relevant in new markets. The Windows tablet isn't the success they hoped for, and the buyout of Nokia seemed more a move to take Nokia out of the market than an asset for Microsoft to enter the phone market and compete with Apple and Samsung. Many big firms that do have lots of Windows workstations are turning to running
Samba on Big Iron because (a) it's cheaper than a huge array of Windows servers, with their attendant reliability and administrative overhead, and (b) it's scalable. Linux isn't the 'rough beast' that Ballmer made out, and Microsoft's 'center cannot hold' the way it has in the past.
At the very least, this will apply 'many eyes' to some of the SSL code, and so long as the pruning isn't wholesale slash-and-burn, cutting it back may prove efficacious for two reasons.
Less code can be simpler code, with decreased likelihood of there being a bug due to complexity and interaction.
Getting rid of the special cases such as VMS and Windows also reduces the complexity.
This isn't news. Signature-based (and hence subscription-based, and hence that whole business model) AV is a wrong-headed approach. As Rob Rosenberger points out at Vmyths.com, we are addicted to the update-cycle model, and its business premise is very like that of drug pushers.
What's that you say? Other types of AV? Like what?
Well, you could have a front-end engine that checks all downloads, all email and attachments, and all URL responses by emulating what would happen when they run on any PC, in any browser, or in any other piece of software: any of the PDF readers you use, the graphical display software, the word processors,
the spreadsheet programs, the music players ... and so on.
Many people in the industry - myself included - have proposed an alternative whereby each machine has a unique cryptographic ID and the legally and properly installed libraries are all signed with that ID, and the program loader/kernel will only load and execute correctly signed code.
Yes, Microsoft tried something similar with ActiveX, but that was signed by the vendor, which can be a good thing, and used PKI, which can also be a good thing. Both can be a problem as well: go google for the details. A local signature has advantages and its own problems.
The local signature makes things unique to each machine so there is no "master key" out there. If your private key is compromised then do what you'd do with PGP - cancel the old one, generate a new one and sign all your software with the new one.
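A minimal sketch of the idea, using only Python's standard library and an HMAC with a machine-local secret as a stand-in for a real per-machine signature scheme. Everything here is hypothetical illustration: a production loader would use proper asymmetric signatures and hardware-backed key storage, not a key held in process memory.

```python
import hashlib
import hmac
import os

# Hypothetical machine-local secret. On a real system this would live
# in protected storage (e.g. a TPM), not a variable; losing or rotating
# it is the "cancel and re-sign" scenario described above.
MACHINE_KEY = os.urandom(32)

def sign_binary(code: bytes) -> bytes:
    """Install-time step: produce a tag unique to this machine's key."""
    return hmac.new(MACHINE_KEY, code, hashlib.sha256).digest()

def load_if_signed(code: bytes, tag: bytes) -> bool:
    """Loader check: refuse code whose tag doesn't verify."""
    expected = hmac.new(MACHINE_KEY, code, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

program = b"\x7fELF...payload"
tag = sign_binary(program)                      # done once, at install
assert load_if_signed(program, tag)             # legitimate binary loads
assert not load_if_signed(program + b"X", tag)  # tampered binary refused
```

Because the key never leaves the machine, a tag generated here verifies nowhere else, which is the "no master key" property: malware signed on one box is just unsigned bytes on every other.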
No technical measure can overcome human frailty in this regard.
- Avira antivirus upgrade wreaks 'catastrophic' havoc on Windows PCs (techworld.com.au)
- How can We Detect Viruses Without Antivirus Software? Built In Antivirus in your Browser 🙂 (shanicomputers.wordpress.com)
- Intel and McAfee unveil plans for unified security future (go.theregister.com)
- John McAfee, antivirus pioneer, arrested by Belize police (networkworld.com)
- GlobalSign Develops Free Tool to Simplify Code Signing Process (prweb.com)
- A Modest Proposal: Please Don't Learn to Code Because It Will Damage Your Tiny Brain (inventwithpython.com)
- Why Authenticity Is Not Security (leviathansecurity.com)
- Certs 4 Less Announces Support For Individual Code Signing Certificates (prweb.com)
- 'Catastrophic' Avira antivirus update bricks Windows PCs (go.theregister.com)
- Avira fixes antivirus update that crippled many PCs (neowin.net)
- Free Anti-Virus Software Fails To Charm Enterprises (informationweek.com)
- Backpack Algorithms And Public-Key Cryptography Made Easy (coding.smashingmagazine.com)
- Cryptography pioneer: We need good code (infoworld.com)
- Contrary to Popular Opinion, Encryption IS the Hard Part (blogs.gartner.com)
- Public Key Cryptography Explained (q-ontech.blogspot.com)
What's interesting here is that this isn't preaching "The Cloud" and only mentions VDI in one paragraph (2 in the one-line expanded version).
Also interesting is the real message: "Microsoft has lost it".
Peter Drucker, the management guru, pointed out that the very last buggy-whip manufacturer in the age of automobiles was very efficient in its processes - it *HAD* to be to have survived that long. (One could say the same about sharks!)
"Keeping desktop systems in good working order is still a labour of Sisyphus .."
Indeed. But the Linux desktop and Mac OS X seem to be avoiding most of the problems that plague Microsoft.
A prediction, however.
The problem with DOS/Windows was that the end user was the admin and could fiddle with everything, including downloading and installing new code. We are moving that self-same problem onto smartphones and tablets. Android may be based on Linux, but it's the same 'end user in control' model that we had with Windows. It's going to be a malware circus.
- eWEEK Review: Unidesk Simplifies VDI Deployment and Management (prweb.com)
- Dell Delivers Desktop-as-a-Service (informationweek.com)
- Zenk GmbH to Distribute Unidesk VDI Management Software in Germany (prweb.com)
- The key questions you must ask to save your virty desktop dream (go.theregister.com)
- 6 Common Desktop Virtualization Mistakes (informationweek.com)
- 5 Best Alternatives of Windows 8 (indianbloggist.com)
Apparently (ISC)2 did this survey ... which means they asked the likes of us ....
Faced with an attack surface that seems to be growing at an overwhelming rate, many security professionals are beginning to wonder whether their jobs are too much for them, according to a study published last week.
Right. If you view this from a technical, bottom-up POV, then yes.
Conducted by Frost & Sullivan, the 2011 (ISC)2 Global Information Security Workforce Study (GISWS) says new threats stemming from mobile devices, the cloud, social networking, and insecure applications have led to "information security professionals being stretched thin, and like a series of small leaks in a dam, the current overworked workforce may be showing signs of strain."
Patching madness, all the hands-on ... Yes I can see that even the octopoid whiz-kids are going to feel like the proverbial one-armed paper-hanger.
Which tells me they are doing it wrong!
Two decades ago a significant part of my job was installing and configuring firewalls and putting in AV. But the only firewall I've touched in the last decade is the one under my desk at home, and that was when I was installing a new desk. Being a Linux user here I don't bother with AV.
"Hands on"? Well yes, I installed a new server on my LAN yesterday.
No, I think I'll scrub it, I don't like Ubuntu after all. I'm putting
in Asterix. That means re-doing my VLAN and the firewall rules.
So yes, I do "hands on". Sometimes.
At client sites I do proper security work. Configuring firewalls, installing Windows patches, that's no longer "security work". The IT department does that. It's evolved into the job of the network admin and the Windows/host admin. They do the hands-on. We work with the policy and translate that into what has to be done.
Application vulnerabilities ranked as the No. 1 threat to organizations among 72 percent of respondents, while only 20 percent said they are involved in secure software development.
Which illustrates my point.
I can code; many of us came to security via paths that involved being coders, system and network admins. I was a good coder, but as a coder I had little "leverage" to "Get Things Done Right". If I was "involved" in secure software development I would not have as much leverage as I might have if I took a 'hands-off' role and worked with management to set up an environment for producing secure software through training and orientation, policy, tools, testing and so forth. BTDT.
There simply are not enough of us, and never will be, to make security work "bottom up" the way the US government seems to be trying. We can only succeed "top down", by convincing the board and management that it matters, by building a "culture of security".
This is not news. I'm not saying anything new or revolutionary, no matter how many "geeks" I may upset by saying that Policy and Culture and Management matter "more". But if you are one of those people who are overworked, think about this:
Wouldn't your job be easier if the upper echelons of your organization, the managers, VPs and Directors, were committed to InfoSec, took it seriously, allocated budget and resources, and worked strategically instead of only waking up in response to some incident, and even then just "patching over" instead of doing things properly?
Information Security should be Business Driven, not Technology Driven.
 Or devolved, depending on how you look at it.
- Information Security By the Numbers (michaelpeters.org)
- Malware in Medical Equipment Poses Serious Threat to Hospital Security (eweek.com)
- Re: CISO Challenges: The Build vs. Buy Problem (1:2) (h30499.www3.hp.com)
- Information Security Awareness Through Analogy (clerkendweller.com)
You betcha it's not!
There are GOOD practices for deploying SNMP.
The BEST practice is to avoid v2.
If you must use SNMP, then use v3,
if you are feeling geekish.
However, my personal view is: DON'T DO IT.
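For those who must, a sketch of what "use v3" looks like with the Net-SNMP tools. The user name and both passphrases below are made-up placeholders, and the SHA/AES choices assume your agent build supports them:

```shell
# In snmpd.conf: create an SNMPv3 user with authentication and privacy.
# "monuser" and both passphrases are placeholders - pick your own.
createUser monuser SHA "auth-passphrase" AES "priv-passphrase"
rouser monuser priv   # read-only, and require the authPriv security level

# Querying that agent from the command line:
snmpwalk -v3 -l authPriv -u monuser \
    -a SHA -A "auth-passphrase" \
    -x AES -X "priv-passphrase" \
    localhost system
```

The point of `authPriv` is that, unlike v1/v2c community strings, credentials and payload don't cross the wire in the clear.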
I saw this assertion go by and it stood out:
The bigger cost would be the cost of not patching. Such items as downtime will affect more staff/users than patching will.
The issue so far has been black and white.
There is a black and white difference between devices that face the internet and those that are not accessible to or from the 'Net.
But what about the "grey"? Not all patches have the same criticality, even for 'Net-facing devices.
And there's more to security - even of the Internet-facing devices - than patching software.
A short while ago I read an article that tried to present both sides of the issue of whether companies should shut down their desktop machines at night.
The 'pro' was of course the saving of electricity - all good and "Green".
The 'con' was that this saving would be offset by the cost in time as employees waited for the machines to boot and waited while they shut down, the latter to make sure that they didn't hang.
The article didn't discuss home users. I'm sure home users would appreciate the savings and be willing to devote the time 🙂 While many people work from home and many children use computers from home, I don't think there is a need for an 'always on' computer in the home.
(Unless you count the fridge or the microwave or the VCR clock ..)
Would turning those computers off affect that botnet? Perhaps. I've certainly met people who, when they learn I'm involved with IT, ask me why their computer runs slower than when they bought it. I ask if they run AV or other anti-malware software, purge adware ... I rarely hear from them again, but when I do it's to say that some tool like "Search-and-destroy" told them they had gazillions of malware. And they ask me where it comes from.
I don't know, I run Linux.
But that argument against turning off corporate machines is specious at many levels. Most of the staff at my clients seem to use laptops rather than desktop machines. They take them to meetings and presentations, sometimes they take them home. All this involves turning off and on. If they don't take them home at night those laptops have to be locked away, not left out on the desk. That's been policy everywhere I've worked this last decade.
The limiting case was one year I worked in a port-a-kabin.
The sub-zero overnight temperatures meant none of the workstations were operative. So we turned on the cabin heating, all the electrics, all the machinery, and went to get a coffee (aka "breakfast"). Half an hour later the cabin was warm enough for the electronics to operate. We were not allowed to leave the cabin powered up overnight.
Would shutting down the home machines each night reduce the level of spam? Perhaps. That's an incentive over and above the Green one of saving electricity. Perhaps service providers' technicians should recommend this over and above regular 'purges'.
The McAfee report doesn't make a clear distinction between commercial and residential hosts for the botnets, though it does mention that some government agencies and banking institutions in Russia are
malware-laden. The large corporations that make up my clients have always had IT departments that support good front-end filtering and make sure the workstations have up-to-date AV software. That being said, I see a lot of people who turn off their AV software. Myth or not, many still believe it affects performance.
Of course I run Linux and I don't have to worry about rogue ActiveX, and I don't run attachments I get in the mail and there are many sites I simply don't visit!
And I turn my home machines off at night.
Related articles by Zemanta
- How to Detect and Prevent Psyb0t, the Linux Router Worm (slumpedoverkeyboarddead.com)
- McAfee: Enabling Malware Distribution and Fraud (readwriteweb.com)
- Spam 'produces 17m tons of CO2' (news.bbc.co.uk)
- OS X 'pirate' trojan resurfaces (vnunet.com)
- Conficker virus begins to attack PCs (canada.com)
I do a bit of work on the fringe of the Ruby community, and the Mac is popular there along with an IDE or two. However I'm beginning to see a few articles to the effect that the IDE is getting in the way after a point and that reverting to your favourite text editor as an IDE is actually more productive.
For old-farts like myself that would be VI (or VIM). Such a comment will probably bring cries of derision, more so than the idea of an editor replacing an IDE. But after a few decades editing is no longer a conscious act. Just as some people touch-type and the words appear on the screen (or paper) without any thought about the mechanics, so too with your favourite editor - only it extends to the non-alphanumeric keys too.
Of course I cheat; VIM has panels and Linux has all these windows and other things that make VIM usable as an IDE. Integrated? Yes, in my head. It's the best place for it.
I could go through a litany of complaints I have about Linux. I could
complain about the confusing number of distributions. I could complain
about the propensity of Linux proponents to cause unnecessary confusion
by abbreviating or using acronyms for Linux-only functions. I could
complain about the silly confusing names they give applications.
How come Linux gets berated for this?
There's a plethora, a confusing plethora, of Microsoft products, since, compared to Linux, that world is unbundled.
But Microsoft aside, look at the auto industry; it was once said that you could order over a quarter of a million different variations given the options on some Chrysler models. There are still many distributor/vendors, and different dealers/outlets offer different deals, trade-ins, offers and options. The auto industry has more acronyms than the computer industry and lots of special functions and tools.
For example, the spring inside my seat-belt buckle slipped out of place so that the buckle won't lock the clip in place. The way the buckle is built you can't take it apart, so the whole assembly has to be replaced. The bolt that fastens it into the seat assembly (remember, the seat has to be able to gyre and gimble without altering the tension of the belt, so the belt is bolted to the seat, not the frame of the car) is a special one, the only one (except for the other seat belt) in the car. Of course it takes a special tool. As it turns out, the tool costs more than the over-priced replacement seat-belt assembly. And since it is for that purpose only on that model series (apparently it was changed for another equally unique bolt and matching tool in later models) my mechanic did not have that tool in his toolbox. He tells me that this is normal, that the auto manufacturers have many twists and turns like this that serve to lock out the independent mechanic by forcing up the cost of operations.
I look at the computer industry and think how easy it actually is to move between vendors of hardware and software. I really can't see why an office worker familiar with MS-Word would be unable to do any work if faced with OpenOffice, or WordPerfect or WordPro. Once upon a time both Apple and Microsoft "sold" the GUI interface as being something that was "obvious" and wouldn't need training and thick documentation. Whether or not that's so, moving from one word processor to another, one mail user interface to another, has nothing to do with the underlying OS or the names and acronyms used.
As the article says:
An operating system exists only to create an environment for
applications; nothing more, nothing less. Most people sit down at a
computer and just start using it without worrying about what operating
system it is running.
So why the fuss? Gnome and KDE have "skins" that can make them look like OS X or any of the Microsoft operating systems. The various distributions of Linux are more like the various offerings of the auto industry: they mostly resemble each other and copy ideas from one another. If you can drive a Ford - sorry, SUSE - you can drive a Chrysler - sorry, Mandriva. Or even a Volvo/BSD. And since I've seen Americans cope in England after just a few minutes, I'll add MGB/LinOS.
So Why Linux?
The article has a theme about moving from Windows to Linux. What it doesn't touch on is why one might want to move.
The reason for most people is that they get a new computer. They are probably going to have to change OS, from W/95 or W/XP to Vista. This is likely to be even more traumatic than if they changed to Linux with an appropriate skin. I've certainly seen many reports of application-only users who had their system "regressed" from a Vista they didn't like to their "old" system, which was actually Linux looking like XP. The reality is that most users see the applications and neither see nor want to see the OS. The same applies for most car drivers. They just want to drive.
When Mark Kaelin says that John Sheesley can crash Linux over and over - so what? The issue isn't that someone with John's background and expertise can crash Linux, it's how stable Linux is for an ordinary user. And compared to Windows, it seems to be about 15 years further down the road. Windows seems to emphasise 'dressing'. Perhaps that's why Mark Shuttleworth wants to address the image of the desktop.
It's worth reading some of John's articles - he's not rabidly anti-Linux. Or rabidly anti-Microsoft.
When Mark points out that viruses and malware exist for Linux he omits to note that these are 'proof of concept' things that don't exist in the wild. The underlying architecture of Linux makes it more resilient to whole classes of malware. The idea that it's 'immune' only because it doesn't have the market share is a myth.
I've asked many people in the business world why they don't use Linux, and all in all their reasons tend to be emotional not logical.
But to be fair, if security and reliability are the deciding issues, as many Linux enthusiasts claim, then why aren't they using BSD? I ask that of them and I get an emotional response similar to the one I see when I ask Windows enthusiasts about Linux.