As if that’s not enough, the *process* of documenting something can itself be a valuable activity. Authors are forced to think through what they are trying to say, get their thoughts in order, structure the content, and generally make the effort to express themselves. They often need to research, reading and building upon other works.
These days, collaborative approaches bring authors, researchers, critics and creatives together, pooling and feeding off each other’s knowledge, expertise and creative energies. It’s great fun when the team gels.
> “I microchip my dog, why couldn’t I microchip my son?”
As pointed out, a dog is property, a son is not. Rights to property are almost absolute; rights to other persons have been pretty limited for a while now.
I got my cat through my local animal shelter. They require a microchip. That way, when they find the cat dead in the street, they can scan it, find the adoption records, and notify me of my loss. I don’t think that is as helpful for children.
My neighbor’s cat wears a fat collar with a cellular uplink that lets my neighbor track where the cat goes during the day. It is interesting, but again of limited utility: the cat generally wouldn’t be in an easy place to recover it from (I think the granularity is around 50m). But if it stopped moving for an extended period of time, it might be helpful for recovering the body.
But really, I think for a child the point wouldn’t be to prevent kidnapping but to track movements; I think parents’ protection rights trump a child’s privacy rights. On my cell phone, I have an app, “GPSlogger”, that will periodically wake up and send a GPS trackpoint to a file or a remote server. I used local files, but it would be trivial to use it to know where the phone was. So give the phone to the child; if very young, sew it into a pocket, and if older, tell them to keep it with them. There might be compliance issues (hey, would you hold my phone for me? I want to go over to the wrong side of the tracks…), but that kind of arrangement can still be helpful between parent and child.
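For what it’s worth, the core of such a logger is tiny. Here is a minimal sketch in Python (not GPSlogger’s actual code; `get_gps_fix` is a hypothetical stand-in for whatever location API the platform offers, and the upload URL is made up):

```python
import json
import time
import urllib.request

INTERVAL_SECONDS = 300           # wake up every five minutes
LOG_PATH = "trackpoints.jsonl"   # local-file option
UPLOAD_URL = None                # or e.g. "https://example.com/track" for a remote server

def get_gps_fix():
    """Hypothetical stand-in for the platform's location API; returns (lat, lon)."""
    return (45.0, -93.0)  # dummy fixed point for the sketch

def log_trackpoint():
    lat, lon = get_gps_fix()
    point = {"ts": time.time(), "lat": lat, "lon": lon}
    if UPLOAD_URL:
        # Send the trackpoint to a remote server as JSON.
        req = urllib.request.Request(
            UPLOAD_URL,
            data=json.dumps(point).encode(),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)
    else:
        # Append to a local file, one JSON object per line.
        with open(LOG_PATH, "a") as f:
            f.write(json.dumps(point) + "\n")

if __name__ == "__main__":
    while True:
        log_trackpoint()
        time.sleep(INTERVAL_SECONDS)
```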
So implanting only adds something when you can’t trust the phone to stay with the child: either the child is so willful he would reject your control (in which case you have a far bigger problem), or he would be forcibly separated from the phone. As noted, child abductions are very rare, and in most cases the child is killed within three hours of the abduction. So again, we are back to making it easier to notify me of my loss.
> Heck a well meaning ‘kid’ might ‘clean it out’ ignorant of the special reason it was like that!
What I find is that the well-meaning ‘kid’ is myself six months or a year later — why the heck did I write it like that? Oh well, it was probably just something awkward… and then several days later (and hooray for RCS) I figure out that it *had* to be that way. So yeah, TiddlyWiki for ongoing microcontent aggregation, plus extensive code documentation, is the only way to keep from wasting a lot of time.
I don’t follow the Quirky product line, but I think it’s based upon the Zigbee standard. Layer 3 and below are built on 802.15.4, combined with a mesh (broadcast route discovery) topology. At the application layer, it has “device objects” that provide the kinds of profiles you are asking for: the protocol tells anyone else in the mesh who asks what kind of device it is and basic parameters about the system.
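The shape of that request/response is easy to picture. A toy sketch in Python (the field names and JSON encoding are invented for readability; real Zigbee device descriptors are compact binary structures exchanged via ZDO requests):

```python
import json
from dataclasses import dataclass, asdict, field

@dataclass
class DeviceDescriptor:
    profile: str            # application profile, e.g. "home_automation"
    device_type: str        # e.g. "on_off_light", "temperature_sensor"
    manufacturer: str
    clusters: list = field(default_factory=list)  # capabilities the device exposes

class MeshNode:
    """A node that can answer 'what kind of device are you?' for any peer that asks."""
    def __init__(self, addr: int, descriptor: DeviceDescriptor):
        self.addr = addr
        self.descriptor = descriptor

    def handle_describe_request(self) -> bytes:
        return json.dumps(asdict(self.descriptor)).encode()

lamp = MeshNode(0x1A2B, DeviceDescriptor(
    profile="home_automation",
    device_type="on_off_light",
    manufacturer="Quirky",
    clusters=["on_off", "level_control"],
))
print(lamp.handle_describe_request())
```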
You seem to be observing that legacy appliances — that fan or desk light — are just using raw AC, and aren’t communicating anything over that protocol. It seems a little unfair to complain about legacy designs from the turn of the last century, when even ENIAC was just a gleam in its father’s eye. (And for what it’s worth, raw resources do have an advantage; cf. “network neutrality”.)
I think we do have naming standards for the IoT. There are still too many of them, perhaps, but they are there.
“The cold war era also had one additional factor that we don’t see in modern software.

It’s been pointed out that modern software is commercial, driven by marketing, and is about ‘features’ (and glitz). Cold war designs were based around a paranoid world view: “they” were out to get us; “they” were planting spies and saboteurs (read malware/trojans); they were seducing our scientists & politicians (think “Manchurian Candidate” and social engineering). While many security droids feel they are “paid to be paranoid”, not all those who claim to be in the security business, and few if any of the wannabes who think it will be a cool career, have that “healthy paranoia” that the job demands.”
It’s a question of risk-averse versus entrepreneurial perspectives. Seeing both sides of the coin enables both parties to appreciate the issue at hand more realistically, I reckon.
It starts by highlighting a US-CERT vulnerability note, observes that Java is pervasive, and notes that one needs to be careful about disabling Java. But then, the US-CERT advice is ‘we are currently unaware of a practical solution’, with disabling Java in the browser suggested as a workaround. This seems correct and valid advice.
It goes on to ask ‘are we fighting a losing battle?’ I’m not sure what the reference to AV products means, as no product has been mentioned to date. I don’t get the digression into ‘bugs’. The observation seems to be that both technical and novice users may have bad habits, which is of course correct but hardly novel.
The next turn is more interesting. It points out that we are increasingly dependent upon web based systems, and so disabling java in a browser is having increasing impact upon user activity. This is important, I think, but equally important is that we are blurring the ‘my computer, your service’ model — we are increasingly dependent upon external providers for *all* resources, not just external data. This has a profound impact upon the trusted path, and should call into question all risk assessment evaluation; increasingly, we are not able to mitigate risk, but depend upon risk transference — while at the same time these players treat security as externalities, and do not accept the transferal of risk. It is a puzzle.
But I guess I don’t understand the subject of this article.
I used to be a professor, so I’m biased in favor of public disclosure and widespread education. But it is far more than short-sighted. It is the same slippery slope that offering knowledge always risks. It is the aspect of the dark side that leads to things like blocking the Hugo Awards ( https://www.ustream.tv/blog/2012/09/03/hugo-awards-an-apology-and-explanation/ ).
I would also like to point out that in grad school I had a friend who had a portion of his PhD thesis classified, preventing him from publishing. (This was in the early days of public key crypto, and this wasn’t an exploit, just insight into how things worked.)
Certain kinds of training *are* harmful. Certain kinds of tools to protect us from ourselves have unintended consequences which can be dire.
This is not easy.
Not actually caring very much about infosec might be another. That’s reflected in the pay levels, which don’t match the job specifications.
There’s rarely a real shortage of people for any kind of job. Usually it’s a shortage of cheap people.
There’s an old saw that goes, “What do you call the guy who came in last in his class in medical school?” The answer: “Doctor.”
Yes, there are situations where the analogue value counts, but that’s not the point I was making.
Consider an annual race such as the Iditarod (see http://iditarod.com/race/). It may happen that one year, because of conditions, the “winner” has a time significantly poorer than last year’s winner. Does that mean that he hasn’t won the race? In the Olympics and track events it’s common for a winner *not* to break the world record, that is, to have a time worse, by an analogue measure, than the record. That doesn’t stop them taking the medal.
Winning is binary, not analogue.
The timing may be analogue but that’s another matter.
Would you accept that there’s a difference between “barely scraping a pass” and “passing with flying colours”? Both are a 1 on the binary compliance scale, but rather different on the analogue real-life scale.
Suppose the pass-mark is 80%. Well, you’ve either passed or you haven’t. Yes, that 80% may be the sum of individual questions; but each of those questions was answered either correctly or incorrectly. Such is the nature of tests.
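To make the point concrete, here is a trivial Python sketch (the answers and pass-mark are made up): the percentage looks analogue, but it is just a sum of binary answers, and the outcome collapses back to a single bit.

```python
# Hypothetical test: each question is simply right (True) or wrong (False).
answers = [True, True, False, True, True, True, True, False, True, True]

PASS_MARK = 0.80  # the arbitrary hurdle under discussion

score = sum(answers) / len(answers)  # looks analogue: 0.8, i.e. 80%
passed = score >= PASS_MARK          # but the outcome is binary

print(f"score={score:.0%}, passed={passed}")
```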
I’ve just had this debate in another form. A younger engineer seemed to think that things like FFT filters and auto-correlators could only be digital (though he did admit that our ears do something like that in analogue, in some way he didn’t understand). I had to explain to him that my first exposure to signal processing was analogue, and I only learnt digital methods later.
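For anyone who has only seen the digital form, here is what a digital auto-correlator boils down to, as a short NumPy sketch; an analogue correlator computes the same integral with delay lines, multipliers and integrators instead of sampled arithmetic.

```python
import numpy as np

def autocorrelate(x: np.ndarray) -> np.ndarray:
    """Lag-indexed sum of products: the digital form of auto-correlation."""
    x = x - x.mean()
    full = np.correlate(x, x, mode="full")  # every lag from -(N-1) to N-1
    return full[full.size // 2:]            # keep the non-negative lags

# Example: a 50 Hz tone sampled at 1 kHz has a 20-sample period,
# so the autocorrelation peaks again at lag 20.
fs, f = 1000, 50
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * f * t)
r = autocorrelate(x)
print("strongest non-zero lag:", np.argmax(r[1:]) + 1)  # -> 20
```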
Yes, out there in the real world, things are ‘continuous’, but I was talking about compliance and tests. You either pass a test or you don’t.
PS Having only just finished reading it, I’ll publish a review of Kevin’s book soon, probably at http://www.NoticeBored.com
Many real-world assessments involve pass-marks or hurdles or qualifying levels, which are arbitrary and very seldom at 100%. Furthermore, regarding the pass-marks themselves, there’s ambiguity and leeway in almost all real-world situations, for very good reasons (e.g. variations in conditions, inaccuracies in measurement, and to give the assessors some latitude to “take things into account”). The alternative – hard and fast, absolute, strictly applied rules – may suit your binary preference, but it can create anomalies and inappropriate outcomes in practice.
Get real! Embrace ambiguity (to some extent)!