My friend Alan Rocker and I often discuss ideas about technology and tradeoffs. Alan asked about SSDs for Linux:
> I haven’t been following hardware developments very closely for a while, so I
> find it hard to judge the arguments. What’s important?
Ultimately what's important is the management software: the layer above the drivers, off to one side. That applies regardless of the media, and it means that the view applications take of storage is preserved even as the physical media change.
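A minimal sketch of that idea, with every name invented for illustration (this is not any real kernel or library API): applications code to an abstract block-store interface, and the backing medium, whether SSD, spinning disk, or RAM, can be swapped underneath without the application noticing.

```python
from abc import ABC, abstractmethod

class BlockStore(ABC):
    """What applications see: blocks in, blocks out, medium unspecified."""
    @abstractmethod
    def read(self, lba: int) -> bytes: ...
    @abstractmethod
    def write(self, lba: int, data: bytes) -> None: ...

class RamStore(BlockStore):
    """One possible backing: an in-memory dict standing in for any medium."""
    def __init__(self, block_size: int = 512):
        self.block_size = block_size
        self._blocks: dict[int, bytes] = {}

    def read(self, lba: int) -> bytes:
        # Unwritten blocks read back as zeros, as on a fresh device.
        return self._blocks.get(lba, b"\x00" * self.block_size)

    def write(self, lba: int, data: bytes) -> None:
        self._blocks[lba] = data

store: BlockStore = RamStore()
store.write(7, b"hello".ljust(512, b"\x00"))
assert store.read(7).rstrip(b"\x00") == b"hello"
```

Swap `RamStore` for an SSD-backed or network-backed implementation and the application layer is untouched; that is the point about the management layer mattering more than the medium.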
> The first question is, what areas are currently the bottlenecks and
> constraints, at what orders of magnitude?
What's the saying? "Those who forget history are doomed to repeat it."
Weren't we doing this with routers, and if not with firewalls as such, then certainly with filtering rules in the routers, way back in the 1980s?
I recall attending a luncheon put on by Dell about "Software Defined Networking". Basically it meant routers 'agile' enough to change routing and implement tactical policy based on load, demand, and new devices or devices making processing demands.
Again, we were doing that in the 1980s. I was working with ANS as they cut the academic internet over to the commercial internet with their "CO+RE" pseudo-product. Basically, they had been supporting the academic internet and were now selling commercial services using the same backbones, trunks and "outlets" (sometimes known as 'points of presence'). This 'policy based routing' was carried out by custom-built routers: IBM AIX desktop boxes, the kind I'd used to implement an Oracle-based time management/billing system for Public Works Ottawa a few years earlier, along with some custom-built T3 interface cards.
For whatever value of “Mobile” is applicable in context, yes.
A lot of what I see is students in the library with their laptops or large tablets-with-keyboards, with paper and books beside them. Perhaps if students had multi-screen displays like the one in the movie "Swordfish", AND there were more books on-line at low cost and with multi-user access (which isn't how many libraries work, sadly), then the marketers' dream of students with ebooks rather than a knapsack of books would come true. As it is, with only one viewer, books and papers are still needed.
I have my doubts about many things and the arguments here and in the comments section loom large.
Yes, I can see that business sees no need for an 'arms race' escalation of desktops once the basics are there. A few people, gamers, developers, might want personal workstations that they can load up with memory and high-performance graphics engines, but for the rest of us it's ho-hum. As for Intel and AMD producing chips with more cores, more cache, integrated graphics and more: well, Moore's Law applies to transistor density, doesn't it, and they have to do something to soak up all those extra transistors on the chips.
As for smaller packaging, what do these people think smart phones and tablets and watches are?
Gimme a break!
My phone has more computing power than was used by the Manhattan project to develop the first nuclear bomb.
These are interesting, but the real application of chip density is going to have to be elsewhere, doing other things to serve the desktop.
And for #1 & #3, Windows will become if not an impediment, then irrelevant.
It's possible a very stripped-down Linux can serve for #1 & #3, but somewhere along the line I suspect people might wake up and adopt a proper RTOS such as QNX, much in the same way that Linux has come to dominate #2. It is, however, possible that Microsoft will, now that Gates and Ballmer are off the scene, adopt something Linux-like or work with Linux so as to stay relevant in new markets. The Windows tablet isn't the success they hoped for, and the buyout of Nokia seemed more to take Nokia out of the market than to become an asset for Microsoft to enter the phone market and compete with Apple and Samsung. Many big firms that do have lots of Windows workstations are turning to running SAMBA on Big Iron because (a) it's cheaper than a huge array of Windows servers, with their reliability problems and administrative overhead, and (b) it's scalable. Linux isn't the 'rough beast' that Ballmer made it out to be, and Microsoft's 'centre cannot hold' the way it has in the past.
Embedding such devices in something edible only means it will end up in the stomach of the targeted user. Perhaps that is intentional, but I suspect not. Better to put the device in the base of the coffee cup.
If I plug in an IDE drive or a SATA drive or a USB drive or device, my mobo or system recognizes what it is. The connection protocol tells the mobo or system.
My digital camera uses exif to convey a vast amount of contextual information and imprint it on each photo: date, time, the camera, shutter, aperture, flash. I have GPS in the camera so it can tell the location, elevation. The exif protocol also allows for vendor specific information and is extensible and customizable.
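That extensibility comes from EXIF riding on TIFF's tag-directory layout: each directory entry is just (tag, type, count, value-or-offset), so vendors can mint new tags without breaking old readers. A toy sketch of reading one real tag, 0x010F ("Make"), from a hand-built little-endian TIFF header; it assumes the ASCII value is stored out-of-line, as it is when longer than four bytes:

```python
import struct

def parse_tiff_make(buf: bytes) -> str:
    """Walk the first TIFF image file directory and return the 'Make' tag."""
    # Byte-order mark: b'II' = little-endian, b'MM' = big-endian.
    endian = "<" if buf[:2] == b"II" else ">"
    magic, ifd_offset = struct.unpack_from(endian + "HI", buf, 2)
    assert magic == 42  # TIFF magic number
    (count,) = struct.unpack_from(endian + "H", buf, ifd_offset)
    for i in range(count):
        entry = ifd_offset + 2 + 12 * i  # each IFD entry is 12 bytes
        tag, typ, n, value = struct.unpack_from(endian + "HHII", buf, entry)
        if tag == 0x010F and typ == 2:  # 'Make', ASCII type
            # Sketch assumes the string doesn't fit in the 4-byte value
            # field, so 'value' is an offset into the file.
            return buf[value:value + n].rstrip(b"\x00").decode("ascii")
    return ""

# A minimal synthetic TIFF: header, one-entry IFD, then the string data.
sample = (b"II" + struct.pack("<HI", 42, 8)
          + struct.pack("<H", 1)
          + struct.pack("<HHII", 0x010F, 2, 6, 26)
          + struct.pack("<I", 0)          # no next IFD
          + b"Nikon\x00")
assert parse_tiff_make(sample) == "Nikon"
```

Vendor-specific data lives behind the same mechanism (the MakerNote tag), which is exactly the kind of open-ended design the post is praising.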
Unless and until we have an 'exif' for IoT, it's going to be lame and useless.
What is plugged in to that socket? A fan, a PC, a refrigerator, a charger for your cell phone? What’s the rating of the device? How is it used? What functions other than on/off can be controlled?
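If such an 'exif for IoT' existed, a device could answer those questions by publishing a small self-describing record. A purely hypothetical sketch, with every field name invented here rather than taken from any real standard:

```python
import json
from dataclasses import dataclass, asdict, field

@dataclass
class DeviceDescriptor:
    # Hypothetical schema -- no real IoT standard is being quoted.
    kind: str                 # "fan", "pc", "refrigerator", "charger", ...
    rated_watts: float        # the device's power rating
    functions: list[str]      # controls available beyond bare on/off
    vendor: dict[str, str] = field(default_factory=dict)  # extensible,
                              # like EXIF's vendor-specific MakerNote

    def to_json(self) -> str:
        return json.dumps(asdict(self))

fridge = DeviceDescriptor("refrigerator", 150.0,
                          ["on", "off", "set_temp", "defrost"])
print(fridge.to_json())
```

The point isn't this particular schema; it's that a socket could interrogate whatever is plugged into it, the way a mobo interrogates a SATA or USB device.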
So, the two-way TV sets of Orwell's novel have arrived, over a quarter of a century late!
It just goes to show. Science-fiction things like the Star Trek communicator (Motorola flip phones), the tricorder (some of the enhanced versions of the Newton), or the data pad (the real-world version has an extra 'i') we do pretty quickly. But if it's a mainstream novel, the kind of thing my old Eng. Lit. teacher would approve of (he sneered at SF and cringed at its mention), then it seems there isn't the same enthusiasm about replicating its technology.
> The Navy's premier institution for developing senior strategic and
> operational leaders started issuing students Apple iPad tablet
> computers equipped with GoodReader software in August 2010,
> unaware that the mobile app was developed and maintained by
> a Russian company, Good.iWare, until Nextgov reported it in February.
OK, so it's not news, and OK, I've posted about this before, but …
Last week I was reading another report about malware and it stated that most malware yamma yamma yamma had its origins in the USA. No doubt you've seen reports to that effect with different slants.
What's interesting here is that this isn't preaching "The Cloud" and only mentions VDI in one paragraph (two in the on-line expanded version).
Also interesting is the real message: “Microsoft has lost it”.
Peter Drucker, the management guru, pointed out that the very last buggy-whip manufacturer in the age of automobiles was very efficient in its processes – it *HAD* to be to have survived that long. (One could say the same about sharks!)
"Keeping desktop systems in good working order is still a labour of Sisyphus …"
Indeed. But the Linux desktop and Mac OS X seem to be avoiding most of the problems that plague Microsoft.
A prediction, however:
The problem with DOS/Windows was that the end user was the admin and could fiddle with everything, including downloading and installing new code. We are moving that self-same problem onto smartphones and tablets. Android may be based on Linux, but it's the same 'end user in control' model that we had with Windows. It's going to be a malware circus.