Monthly Archives: March 2004

Vonage To Offer Wi-Fi VoIP Phones

Vonage To Offer Wi-Fi VoIP Phones. Vonage said that it will start offering VoIP phones that operate on Wi-Fi networks later this year: The phones will let subscribers make and receive phone calls when in range of a Wi-Fi network, either in their homes or in public places. A Vonage exec said the move is in response to AT&T's announcement that it is offering VoIP in Texas and New Jersey. The market for VoIP services is definitely getting crowded, so anything that can differentiate a service will help. Vonage has a market-leadership position as an early VoIP provider, so it makes sense for it to be a leader in extending VoIP to Wi-Fi networks. I suspect that VoIP Wi-Fi phones will be most useful in homes or businesses. Users will be disappointed if they hope to carry the phone around everywhere and expect it to work like a cell phone. The phones will only work in very limited areas outside of the home and office. Vonage and any other provider that offers such a service will have to be very careful about how they market it. Vonage doesn't seem to have issued an official announcement with any more information about the service…. [Wi-Fi Networking News]

HOW-TO: VNC secure tunneling using the Windows PuTTY SSH client.

HOW-TO: VNC secure tunneling using the Windows PuTTY SSH client. Quote: “Secure connection to remote desktop using open source VNC or TightVNC software (which normally doesn't encrypt traffic other than the password). As a 'side effect', shorter response times are achieved due to SSH's efficient traffic compression.”
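
The write-up walks through PuTTY's GUI; as a rough equivalent, here is a minimal sketch of the same tunnel driven from Python. The host name, account, and ports are placeholders, and it assumes PuTTY's command-line client plink.exe is on the PATH:

    import subprocess

    # Launch plink and forward local port 5901 to port 5900 (VNC display :0)
    # on the remote machine, so all VNC traffic rides inside the encrypted,
    # compressed SSH session. "user@remote.example.com" is a placeholder.
    tunnel = subprocess.Popen([
        "plink", "-ssh",
        "-C",                          # enable SSH traffic compression
        "-N",                          # no remote shell; port forwarding only
        "-L", "5901:localhost:5900",   # local 5901 -> remote 5900
        "user@remote.example.com",
    ])

Once the tunnel is up, point the VNC viewer at localhost:5901 instead of the remote address; call tunnel.terminate() to tear it down.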

Comment: A nice step-by-step answer to part of my question the other day.  Generous responses and a little bit of experimentation seem to have got me to the next stage.   In practice, once the machine is up and running the way we like, I'm sure we'll just kill the VNC access until we feel like we need it again.  [Serious Instructional Technology]

Increasing Business Intelligence

Increasing Business Intelligence

ITtoolbox, the producer of the Business Intelligence Knowledge Base, recently launched a series of blogs that take a first-hand look into the daily challenges faced by real-world IT professionals.

Addressing business intelligence, technology strategy, project management, software development, and business analysis, the blogs are insightful, personal approaches to organizational technology challenges. The service might be worth following to learn more about shared challenges, new ideas, and practical solutions.  [Fast Company Now]

Knowledge Mismanagement

Knowledge Mismanagement.

Two interesting perspectives on knowledge management were published online recently. The essays provide thought-provoking bookends to the consideration of how information is used within organizations — and how teams collaborate. In Technology Review, Alex Pentland contends that data mining doesn't go far enough. Companies need to be “reality mined.”

Studies of office interactions indicate that as much as 80 percent of work time is spent in spoken conversation, and that critical pieces of information are transmitted by word of mouth in a serendipitous fashion. Commonplace wearable technology can be used to characterize the face-to-face interactions of employees, and to map out a company's de facto organization chart. The new reality-mined data allow us to cluster people on the basis of profiles generated from an aggregate of conversation, e-mail, location, and Web data.
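
To make the clustering step concrete, here is a minimal Python sketch, assuming per-employee feature vectors have already been extracted from the wearable data; the numbers and the choice of k-means are purely illustrative, not Pentland's actual method:

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    # Hypothetical per-employee profiles: [share of time in conversation,
    # e-mails per day, location entropy, web requests per day].
    profiles = np.array([
        [0.80, 40, 1.2, 110],
        [0.75, 35, 1.0,  95],
        [0.30, 90, 0.4, 300],
        [0.25, 85, 0.5, 280],
    ])

    # Normalize the features, then group employees with similar interaction
    # profiles; the clusters stand in for the "de facto organization chart".
    X = StandardScaler().fit_transform(profiles)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    print(labels)  # e.g. [1 1 0 0]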

While it seems a little spooky to think that all of my conversations and personal interactions in the office could be persistently recorded, analyzed, and applied in other areas of my work life (Big Brother, anyone?), the idea got me thinking. If this information weren't necessarily available to my employers or managers, how could I use a persistent audio and video recording day to day? You know what, I think I could do a lot with such a recording. Imagine being able to record a meeting or an interview, tapping your side to set buffer markers for content you want to access later.
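
That tap-to-mark idea maps naturally onto a ring buffer. A minimal sketch, with the recorder, chunk size, and marker interface all invented for illustration:

    import time
    from collections import deque

    class MarkedRecorder:
        """Keep only the most recent chunks of a stream, plus tap markers."""

        def __init__(self, capacity=3600):        # e.g. an hour of 1 s chunks
            self.buffer = deque(maxlen=capacity)  # oldest chunks fall off
            self.markers = []                     # moments worth revisiting

        def record(self, chunk):
            self.buffer.append((time.time(), chunk))

        def tap(self):                            # the tap-your-side gesture
            self.markers.append(time.time())

        def clips_near(self, marker, window=30):
            """Return chunks captured within `window` seconds of a marker."""
            return [c for t, c in self.buffer if abs(t - marker) <= window]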

On the flip side of that idea is “JibbaJabba”'s blog entry “Overcoming a Clenched Fist Knowledge Culture.” Taking a look at how top-down approaches to knowledge sharing might or might not affect grassroots efforts to share information, the writer makes the case that information will be shared at the frontlines regardless of how hierarchical or bureaucratic a company is.

What's different with where people connect on a personal level is that they do so without having to route themselves through a process or system that was created by upper management without looking into what people actually need or how they work. They do so because they need to and because, without the interference of a CIO or management-chosen system, they can control the knowledge creation and flow. The difference in tools is that because they're closely held and controlled, they may have a better chance of being used and sustained by the people who need them. They may also die when they're no longer needed. This should be natural, if you take the view of the company as an information ecology.

That's not too far afield from traditional thinking about KM, but the overarching idea has merit. Information will be shared, and knowledge will be created, regardless of colleagues', leaders', or companies' efforts to clamp down on intra-organizational sharing.  [Fast Company Now]

Metadata? Thesauri? Taxonomies? Topic Maps! – Making Sense of It All

Metadata? Thesauri? Taxonomies? Topic Maps! – Making Sense of It All. Lars Garshol, Development Manager at Ontopia posted a fantastic article on the relationships between different classification tools – topic maps, ontologies, taxonomies, and more. Well worth the read, since it's a clear explanation that separates similar concepts that too often get muddled. [ia/ blogs]

Patching vs. Intrusion Prevention

Patching vs. Intrusion Prevention.

I'd better start this entry off by stating that for a living (what puts bread on my table) I write computer security tools and technology, with my latest research into mandatory access control driven by providing “process rights management” through host-based intrusion prevention. Yeah, it's a mouthful, but basically I have written code that grafts onto the Windows kernel to strengthen the Windows platform by providing application confinement and isolation from the rest of the system. The result is that I can apply the rule of least privilege to resources on a machine and provide safe containment… isolating suspect or even hostile activity so it can't destroy the system.
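
The kernel grafting is the hard part, but the access decision at the core of that confinement model boils down to a default-deny policy lookup. A toy Python sketch, with the process names and paths invented for illustration:

    # Default-deny confinement: a process may touch only the resources its
    # policy explicitly grants (the rule of least privilege); everything
    # else, including any unlisted process, is refused.
    POLICY = {
        "inetinfo.exe": {
            ("read",  r"C:\inetpub\wwwroot"),
            ("write", r"C:\temp"),
        },
    }

    def allowed(process, action, resource):
        grants = POLICY.get(process, set())   # unknown process: no grants
        return any(action == act and resource.startswith(path)
                   for act, path in grants)

    print(allowed("inetinfo.exe", "write", r"C:\temp\upload.tmp"))   # True
    print(allowed("inetinfo.exe", "write", r"C:\Windows\system32"))  # False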

The reason I am telling you this is that today I noticed a debate on Network World called “Is patch mgmt. the best protection against vulnerabilities?” in which Shavlik Technologies (for) and Sana Security (against) face off. It's hard to say this without you snickering at me, but I HATE it when you square off two vendors to make an assessment for the information security profession when both have a stake in their position. (In case you didn't know, Shavlik sells patch management software and Sana sells intrusion prevention software.) It is typically biased, and slanted towards their products.

Let's get real. The reality is BOTH are right, and BOTH are wrong, for different reasons. Let's look at this from an infosec point of view while understanding the mindset of an administrator responsible for the critical infrastructure of an organization.

Patch management is only effective when actually completed in a timely manner to reduce the threat of exposure from attack. If you look at the most recent trends, most attack vectors are built AFTER a patch is released, as it is much easier for an attacker to disassemble a patch to find the vulnerabilities in question and create new hostile code to exploit them. The “for” camp in this argument states that application and OS vendors don't always tell you what the patch fixes, which means you need to patch against the unknown. Here is the problem with that argument. How can an administrator of a Fortune 100 company blindly patch a system with code he knows nothing about… especially if you KNOW the vendor isn't telling you everything? He can't. Which is why administrators typically do a staged rollout in a 'clean room' to do regression testing against their existing architecture. And in many cases… the patches do more harm to the system than good. Countless avenues of attack open up in the meantime, exposing the business to more risk. The time between patch release and exploit release is shortening as attackers get smarter in their disassembly techniques.

On the other side, the “against” camp states that because customers are not aware of new vulnerabilities, they cannot defend against the new exploits… but host-based intrusion prevention software will solve it. There is a catch they don't want to tell you. Most intrusion prevention systems use a combination of signature-based techniques and whitelist databases to determine access control. The problem with this is that new 0-day attacks don't play by these rules, and they can typically get around such techniques. Moreover, if you use a stringent set of “don't” rules… you end up with an administrative nightmare trying to tune the IPS to work in your environment.
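
That catch is easy to see in miniature: a lookup-based IPS can only refuse what it already knows about. A toy Python sketch with invented signatures and hashes:

    SIGNATURES = {"cmd.exe /c tftp"}    # known-bad payload patterns
    WHITELIST = {"hash-of-winword", "hash-of-outlook"}   # approved binaries

    def verdict(payload, binary_hash):
        if any(sig in payload for sig in SIGNATURES):
            return "block"              # known-bad: caught
        if binary_hash in WHITELIST:
            return "allow"              # known-good: trusted
        # A 0-day lands here: a default of "allow" lets it straight through,
        # while a default of "block" creates the tuning nightmare above.
        return "allow"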

Proof is in how signature-based solutions have failed in other security verticals. Look at antivirus and personal firewalls as an example. The latest CSI/FBI Computer Crime and Security Survey shows that of those organizations that reported breaches in the last 12 months, 98% had firewalls in place and 99% had antivirus. Yet they were still breached. Does that mean we throw the technology out? No. It just means that these tools don't work alone, in isolated environments. And how much WORSE would the breaches have been WITHOUT the technology in place?

To mount a proper defense, we need to use a layered defensive posture which includes it all. We should have firewalls, antivirus, network IDS, host-based IPS and patch management. Our decisions have to be part of a BIGGER process in the security management lifecycle. (This is why Schneier says security is a process, not a product.) Remember when I was talking about the 8 rules of Information Security last year? Using a defensive posture like this touches on almost every rule:

  • You need to control change management. (Rule of Change Management) You cannot blindly apply patches; you need to be vigilant and ensure all systems are up to date. A staged rollout is one way to test the change management process and fully understand the implications of the changes.
  • It's not just about applying technology and leaving it alone. (Rule of the Three-Fold Process) You must include monitoring and administration to ensure you keep up to date. Patch management systems are great for this.
  • You must consider everything hostile, and then slowly allow things to happen on your systems. (A combination of the Rule of Least Privilege and the Rule of Trust) Host-based intrusion prevention is perfect for that, when properly tuned and in force.
  • Keeping up with patches ensures that you are strengthening your weakest points at all times. (Rule of Preventative Action)
  • Host-based intrusion prevention not only DETECTS attacks, it can PREVENT them. That's the whole point of it. As such, you can immediately respond to threats as they occur in a sane manner, using the logs/reports to provide forensic audits of the attack. (The Rule of Immediate and Proper Response) Hell, my IPS system will even go so far as to terminate the attack in mid-execution if it matches certain criteria (defined by the administrator, of course); a rough userland sketch follows this list.
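
A real HIPS enforces this in the kernel, but a rough userland approximation of terminate-on-match, with the criteria invented and the psutil library assumed, looks like:

    import psutil

    SUSPECT = {"nc.exe", "tftp.exe"}    # illustrative matching criteria

    def sweep():
        """Terminate any running process that matches the criteria."""
        for proc in psutil.process_iter(["name"]):
            if proc.info["name"] in SUSPECT:
                proc.terminate()        # stop the attack mid-execution
                print("terminated", proc.info["name"], "pid", proc.pid)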

My point here, folks, is that as vendors, we sometimes seem to use FUD or “tainted” messaging to sell our products. Don't buy into it. (And if you ever see my company do it, please email me a stern warning and point me to this entry.) Always consider the bigger picture in your security management lifecycle when evaluating technology. After all… technology is simply an enabler. It's not the solution!

Oh… and if you ARE going to roll out host-based intrusion prevention on Microsoft Windows servers, contact me. I might be able to help you out. 🙂  [Dana Epp's ramblings at the Sanctuary]

Proactive sales in a small ISV

Proactive sales in a small ISV.

Eric has posted a great MSDN article on the function of proactive sales in a small ISV.

I love reading his insights on this sort of thing. As the owner of a small ISV myself, I completely understand and relate to his writings. This article is no different. Wrestling with the question of when, and whether, to get a sales guy is important, and understanding a sales guy's traits is quite difficult. He answers this question quite well.

Good job Eric.  [Dana Epp's ramblings at the Sanctuary]

Writing SW for Users

Writing SW for Users

Clay Shirky has just posted a terrific (and fairly long) article on what he calls Situated Software. 

By that, he means software that is written by members of a small group for its own use.  He contrasts that sharply with what he calls Web software, by which he means more traditional software, written by programmers for large, generic groups of users, which must be written to scale and be durable.

He makes great arguments for why such software can be much more useful, since it can omit things that the group already knows about itself (such as the reputations of individual members) and it can rely upon group dynamics for some tasks not included in the software itself.  (For example, he describes a student group-buying system with no formal method for dealing with non-payment, since the students could deal with that via suasion or ostracism.) 

This is not an argument for abandoning formal software development, but rather a recognition that it is useful to also have informal software that is developed closer to its users and intended for less formal, shorter usage, just for a particular group or purpose.  He also argues that as the number of formal software developers in the U.S. diminishes (which has been predicted), this informal programming may grow into a new part of many people's skill sets, changing how we program, just as typists and secretaries have all but disappeared while all of us now create and enter our own data.  [amywohl News]

Downloading doesn't hurt sales: new study

Downloading doesn't hurt sales: new study.

“Downloads have an effect on sales which is statistically indistinguishable from zero.” “The Effect of File Sharing on Record Sales: An Empirical Analysis” (.pdf file), a new study from two professors at UNC and Harvard Business School, finds that internet downloading of copyrighted, commercial materials did not harm sales during the study's timeframe.  [Smart Mobs]