Some Management Fundas

Once it was called ‘work satisfaction’, then ‘commitment’ and now ‘engagement’. Its opposite is ‘alienation’: feeling estranged from all that happens in the workplace.

All managers want their staff to be fully committed to the aims of the organisation, happy in their work and totally engaged in what they are doing.

So how to achieve engagement? Indeed, is it even possible to engage people doing unskilled, dreary, repetitive work? And is engagement an end in itself, or does it lead to other desirable outcomes such as productivity, profitability, staff retention and customer satisfaction?

The research in this area shows pretty consistent findings. The results are neither surprising nor counterintuitive. And they have been known for ages. So why is it that supervisors and managers do not perform their duties so as to maximise the commitment and engagement of their staff?

There are some fairly basic but important things a manager needs to do to maximise engagement.
  1. Let every person know what is expected of them in terms of their processes and products. Be clear. Check understandings. And revisit expectations as they change. All people have hopes and expectations about promotion, about change, about what their organisation should be doing for them (and they for the organisation). These expectations need to be managed.

  2. Give people the tools for the job. Keep them up to date. Train them how to use these tools. Make sure that processes are well thought through so that the technology people use is appropriate for what they are required to do. In short, give technical and informational support.

  3. Give your reports opportunities to learn and shine at what they are good at. People like to celebrate their skills, abilities and unique gifts. Help them find and explore them. Let them do their best all the time. And encourage development of strengths.

  4. Be generous but targeted in praise. Recognise effort and success. Recognise individuals and how they strive to achieve. Celebrate success. Notice and praise individuals when they have put in extra effort. And do it openly, naturally and regularly.

  5. Listen to your employees. They often have very good innovative ideas. Yes, they can and do complain but listen to that too. They need to believe their ideas count, their voice is heard, they can contribute to how the work is arranged.

  6. Help them believe in the purpose or product of the organisation. People need to feel their job is important; that they really are making a contribution to society. This involves more than writing fancy mission statements. It’s about giving the job a sense of meaning and purpose.

  7. Encourage friendship formation at work. This is more than insisting on teamwork. It is giving people space and time to build up a friendship network. Friends are a major source of social support. They make all the difference to the working day. And committed people commit their friends.

  8. Talk to people about their progress. Give them a chance to let off steam; to dream about what might be; to have quality time with you. This is more than those detailed, often rather forced appraisals. It is about opportunity for the boss to focus on the hopes, aspirations and plans of the individual.

Pretty obvious stuff. Be clear about what you want. That is, define the outcomes required for individuals which will stretch and challenge them. Focus on what they do well: their strengths, gifts and talents. Try to find the best individual and the best in the individual. Make them exemplars, heroes, models. Find the right fit between a person’s talents and ambitions and the tasks they need to do.

Look for ambitious, achievement-oriented, energetic individuals. But steer their striving: manage their route-map. And look for, listen to and reward evidence of independent ideas and thinking. Never assume management has a monopoly on the truth. Also encourage camaraderie: help people who are social animals relate to each other and pull together.

Do all of the above and you have an engaged workforce. And we do know that happy, healthy staff treat customers better. It’s a relatively simple causal link. It pays to focus on staff engagement. But it’s also the fundamental task of all management.

Your SME Network Looking Like An ATM?

An ATM that you get money out of should be as secure as a bank vault, but just as that ATM has to be refilled every day by someone, your network can become vulnerable to someone on the inside or outside of it.

Systems used to be like ATMs - you gave the right code and the ATM gave you the cash. Then hackers got to them, and security had to be strengthened to the point where hackers now needed an inside man. Guess what: the inside man came in the form of email phishing scams that targeted unknowing consumers to give up the code - and then out comes the cash.

Then, to give regular firewalls a real workout, there are blended threats - viruses, worms, trojans, rootkits, and other hacks. Running a successful business adds to your responsibilities - more employees, larger networks, larger databases, and possibly teleworker VPN options - definitely a job for an appliance that can handle blended threats.

The sights are now on us...

Let's look first at the changes in the SME network security environment and how blended threats have started to trickle down to us little guys. Multinational and large enterprise networks have always fought blended threats - spyware prevention, rootkit detection, spam blocking, intrusion prevention and URL filtering are what the big boys' IT departments use on a daily basis at the gateway to the Internet. Most of the time the IT guy is able to hold back the unknown threats with layered security - yes, I said unknown threats. Each of the threats we know about today became infamous only once the damage was done, or once the attack was blocked to the point where the loss was not significant. These threats began as unknowns.

And that's the good news: when the big boys trigger technological innovations, the little people always get a taste. The difference is that when an unknown threat attacks our firewall - hopefully we all have some form of security, right? - the damages and loss can be catastrophic. Most small and medium businesses can't have a separate IT department that handles security, storage and compliance, or the money to absorb downtime due to lost productivity and data. Businesses of any size are looking for ways to effectively prevent attacks right at the perimeter, before they reach the desktop.

Prevent Your SME Network From Looking Like an ATM

Blended threats have met their match in Unified Threat Management (UTM) devices. Our much larger brethren were tired of buying a new security solution every time a new threat reared its ugly head to menace the network. Their budgets may look unlimited, but the bean counters began complaining about the bleeding edge of security even if they could afford the attacks. Integrated security appliances answer both the IT department's and the CFO's dreams by handling everything blended threats can throw at a network - blocking viruses, worms, spyware, trojans, and other attacks without relying solely on signatures. That reduced dependence on signatures rests partly on the effectiveness of 'intelligent layered security' and host intrusion detection: threats are met at the perimeter, or rejected by layered security that intelligently adapts to them before they hit the internal network. This is just a brief unified threat management overview, to touch upon how it acts as intrusion detection and intrusion prevention in one box.

If you think your business is already secure enough and doesn't need a UTM - a combined intrusion detection and prevention appliance - ask yourself what will keep unknown threats away from the desktops in your office when they attack your network. I hope you can sleep well... because your network can never go to sleep.

Time To Go Thin?

Juicy Assets, Ripe For Picking...

So here's an interesting spin on de-perimeterisation (removing the boundary between the internal network and the internet)... if people think we cannot achieve this, cannot wait for secure operating systems, protocols and environments, and need to "secure" their environments today, I have a simple question, supported by a simple equation for illustration:

For the majority of mobile and internal users in a typical company who use the same basic set of applications:

  1. Assume a company that fits within the 90% who still run servers in-house and aren't completely outsourced, and that supports users who run a Microsoft OS and the usual suspect applications on fat clients and laptops.

  2. Take the following:
    Data Breaches. Lost Laptops. Non-sanitized corporate hard drives on eBay. Malware. Non-compliant configurations. Patching woes. Device Failures. Remote Backup issues. Endpoint Security Software Sprawl. Skyrocketing security/compliance costs. Lost Customer Confidence. Fines. Lost Revenue. Reduced budget.

  3. Combine With:
    Cheap Bandwidth. Lots of types of bandwidth/access modalities. Centralized Applications and Data. Any Web-enabled Computing Platform. SSL VPN. Virtualization. Centralized Encryption. Lots of choices to provide thin-client/streaming desktop capability. Offline-capable Web Apps.

  4. Shake Well, Re-allocate Funding, Streamline Operations and "Security"...

  5. And, Ta Da, You Get...:
    Less Risk. Less Cost. Better Control Over Data. More "Secure" Operations. Better Resilience. Assurance of Information. Simplified Operations. Easier Backup. One Version of the Truth (data).

Why? Can Someone Tell Me Why?

I really just don't get why we continue to deploy and support platforms we can't protect, allow our data to inhabit islands we can't control, and at the same time admit the inevitability of disaster while continuing to spend our money on solutions that can't possibly solve the problems.

Until operating systems are more secure, data can self-protect and networks can "self-defend", why do we continue to focus on fat-client PCs? They are a waste of time.

If we can isolate and reduce the number of ways of access to data and use dumb platforms to do it, why aren't we?

...I mean besides the fact that an entire industry has been leeching off this mess for decades...

I'll Gladly Pay You For A Solution Today...

The technology exists TODAY to centralize our most important assets and allow our workforce to accomplish their goals and business to function better without the need for data to actually "leave" the servers in whose security we have already invested so much money.

Many people are doing that with their servers already through the adoption of virtualization. Now they need to do the same with their clients.

The only reason we're now going absolutely stupid and spending money on securing endpoints in their current state is that we're CAUSING, not just allowing, data to leave our enclaves. In fact, with all this BlaBla 2.0 hype, we've convinced ourselves that we must. Utter hogwash.

Relax, Keep Your Firewalls On...

In the case of centralized computing and streamed desktops to dumb/thin clients, the security perimeter still includes our servers and security castles, but it also encapsulates a streamed, virtualized, encrypted, and authenticated thin-client session bubble. Instead of worrying about the endpoint, we can treat it as nothing more than a flickering display with a keyboard and mouse.

Let your kid use Limewire. Let Uncle Bob surf www. Let wifey download spyware. If my data and applications don't live on the machine and all the clicks/mouseys are just screen updates, what do I care?

Yup, you can still use a screen scraper or a camera phone to use data inappropriately, but this is where balancing risk comes into play. Let's keep the discussion within the 80% of reasonable factored arguments. We'll never eliminate 100% and we don't have to in order to be successful.

Sure, there are exceptions and corner cases where data does need to leave our embrace, but we can eliminate an entire class of problem if we take advantage of what we have today and stop this endpoint madness.

This goes for internal corporate users who are chained to their desks and not just mobile users. Oh, and did I forget to mention the hugely reduced cost of ownership...

What's preventing you from doing this today?

Simple But Effective Search Engine Optimization

Most web designers see search-engine optimization (SEO) as a dirty trick, and with good reason. Most search engine optimizers pollute search engine results with spam, making it harder to find relevant content when searching. But there is more than one type of search-engine optimization. In common usage, a black-hat SEO seeks to achieve high rankings in search engines by any means possible, whereas a white-hat SEO seeks to code web pages in a way that is friendly to search engines.

Thanks to XHTML and CSS, many web design best practices overlap with those of a white-hat SEO. The reason is simple: practices such as separating style from content, minimizing obtrusive JavaScript, and streamlining code allow search engines to more easily spider, index, and rank web pages. In addition, high accessibility in web design overlaps heavily with effective white-hat search-engine optimization.

Accessibility for search engines

On further reflection, this overlap makes sense. The goal of accessibility is to make web content accessible to as many people as possible. We can think of search engines as users with substantial constraints - they can’t read text in images, can’t interpret JavaScript or applets, and can’t view many other kinds of multimedia content. These are the types of problems that accessibility is supposed to solve in the first place.

A few checkpoints for accessibility

Having seen why high accessibility overlaps with effective search-engine optimization, let's see how it does so. Let's touch upon each Priority 1 checkpoint in the W3C Web Content Accessibility Guidelines which affects search-engine optimization.

1.1 Provide a text equivalent for every non-text element (e.g., via “alt”, “longdesc”, or in element content)...

Not only are search engines unable to understand image and movie files, they also cannot interpret any content that is purely visual. alt and longdesc attributes therefore help them understand the subject of such content.
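A minimal sketch of this checkpoint in markup - the filename, alt text and longdesc target are illustrative, not taken from any real site:

```html
<!-- A text equivalent lets search engine spiders (and screen readers)
     understand what the image conveys -->
<img src="sales-chart.png"
     alt="Bar chart of quarterly sales, rising steadily through the year"
     longdesc="sales-chart-description.html" />
```

The alt text carries the image's meaning in a sentence; longdesc points to a fuller textual description for complex graphics such as charts.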

Search engines are also deaf when it comes to audio files. Again, providing textual descriptions of these files allows search engines to better interpret and rank content that they cannot hear.

1.2 Provide redundant text links for each active region of a server-side image map.

Text links are very important to search engines, since anchor text labels the content of a link’s target page. In fact, many search engine optimizers consider anchor text to be the single most important factor in modern search algorithms. If a website uses an image map rather than a text-based menu as the primary navigational method, a redundant text-only menu elsewhere on the page will give search engines additional information about the content of each target page.
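A sketch of what this looks like in practice - the map handler URL, image and link targets are illustrative:

```html
<!-- Server-side image map as the primary navigation -->
<a href="/cgi-bin/navmap"><img src="navbar.gif" ismap="ismap"
     alt="Site navigation" /></a>

<!-- Redundant text links give spiders crawlable anchor text
     for each region of the map -->
<p>
  <a href="/products/">Products</a> |
  <a href="/support/">Support</a> |
  <a href="/contact/">Contact</a>
</p>
```

The text menu duplicates every active region of the map, so each target page gets descriptive anchor text even though the map itself is opaque to spiders.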

4.1 Clearly identify changes in the natural language of a document’s text and any text equivalents (e.g., captions).

Major search engines maintain country- and language-specific indexes. Specifying the language of a document, or of text within a document, helps search engines decide in which indexes to place it.
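In (X)HTML the natural language is declared on the root element and can be overridden inline; the phrase below is just an example:

```html
<!-- Document-wide language declaration (XHTML pairs lang with xml:lang) -->
<html xmlns="http://www.w3.org/1999/xhtml" lang="en" xml:lang="en">

<!-- An inline change of natural language, clearly identified -->
<p>As the French say, <span lang="fr" xml:lang="fr">c'est la vie</span>.</p>
```

The root-level declaration tells engines which language index the page belongs in; the inline lang attribute marks the span that departs from it.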

6.3 Ensure that pages are usable when scripts, applets, or other programmatic objects are turned off or not supported [...]

Some users choose to disable JavaScript and applets in their browser’s preferences, while other users’ browsers do not support these technologies at all. Likewise, search engines’ browsers do not read scripts; therefore a webpage’s usability should not be crippled when scripts are not supported. Otherwise, search engines may not even index the page, let alone rank it well.
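A common safeguard for this checkpoint is a noscript fallback so navigation stays usable and crawlable when scripts are off; the script name and menu items here are placeholders:

```html
<!-- Scripted drop-down menu for capable browsers -->
<script type="text/javascript" src="menu.js"></script>

<noscript>
  <!-- Plain text links duplicate the scripted menu for spiders
       and for users who have disabled JavaScript -->
  <ul>
    <li><a href="/articles/">Articles</a></li>
    <li><a href="/archive/">Archive</a></li>
    <li><a href="/about/">About</a></li>
  </ul>
</noscript>
```

With the fallback in place, a spider that ignores menu.js still finds a text link to every section of the site.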

14.1 Use the clearest and simplest language appropriate for a site’s content.

It is a bit less obvious how this particular checkpoint aids search-engine optimization. If a website contains the clearest and simplest language appropriate for the site’s content, it is probably using those keywords with which potential searchers will be most familiar. Searchers tend to use succinct queries containing familiar language. Thus, to receive maximum traffic from search engines, it is best that a website contain the same words which the site’s audience will use when searching.

The benefits do not end with Priority 1 — many of the Priority 2 and 3 Checkpoints are important for search-engine optimization purposes, too. For instance, Checkpoints 6.2 and 6.5 refer to the accessibility of dynamic content. In fact, making dynamic content search engine-friendly is one of the most daunting tasks a search engine optimizer faces when working on an eCommerce or a database-driven site. Following the W3C’s recommendations can help to avoid any indexing or ranking problems related to using dynamic content.

From the horse’s mouth

If you doubt any of the above, perhaps a visit to Google’s Webmaster Guidelines could convince you that Google rewards high accessibility. This page specifically mentions best practices which will help Google “find, index, and rank your site.”

  • Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
  • Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.
  • Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
  • Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.
  • Try to use text instead of images to display important names, content, or links. The Google crawler doesn’t recognize text contained in images.
  • Make sure that your title and alt tags are descriptive and accurate. [...]
  • Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site. 

Note that each of Google’s guidelines actually correlates with a W3C Web Content Accessibility Guideline. Oddly enough, the word “accessibility” does not actually appear in Google’s Webmaster Guidelines. Perhaps they are afraid of scaring off some webmasters with technical jargon? In any case, it is clear that Google is lobbying for high accessibility.

Another feather in accessibility’s cap

The checkpoints highlighted above are just a few of the many ways that high accessibility helps optimize a website for search engines — many of the other checkpoints in the W3C Web Content Accessibility Guidelines are helpful for search-engine optimization as well. If accessibility gets a website more traffic from Google, even better!

The good news is that a web designer who follows best practices for accessibility is already practicing solid white hat search-engine optimization. Search engines need not scare anyone. When in doubt, design your site to be accessible to blind and deaf users as well as those who view websites via text-only browsers, and search-engine optimization will fall into place automatically.

Why It Fails

When organizations go online, they have to decide which e-business models best suit their goals. A business model is defined as the organization of product, service and information flows, and the source of revenues and benefits for suppliers and customers.

Automation is uniquely difficult because its complexity extends beyond your company's walls. Your people will need to change the way they work, and so will the people at each supplier, distributor and customer. Only the largest and most powerful manufacturers can force such radical changes down their partners' throats; most companies have to sell outsiders on the system. Moreover, your goals in installing the system may be threatening to them, to say the least.

  • Internal Resistance to Change
  • External Resistance to Change
  • Many Mistakes at First
  • Historical and Accurate Data
  • Skilled Manpower
  • Continuous Training and Upgrading
  • Planning and Implementation
  • Management Support

Resistance to change: Operations people are accustomed to dealing with phone calls, faxes and hunches scrawled on paper, and will most likely want to keep it that way. If you can't convince people that using the software will be worth their time, they will easily find ways to work around it. You cannot disconnect the telephones and fax machines just because you have software in place.

Many mistakes at first: There is a diabolical twist to the quest for software/automation acceptance among your employees. New systems process data as they are programmed to do, but the technology cannot absorb a company's history and processes in the first few months after an implementation. Forecasters and planners need to understand that the first bits of information they get from a system might need some tweaking. If they are not warned about the system's initial naiveté, they will think it is useless.

In one case, just before a large automotive industry supplier installed a new supply chain forecasting application to predict demand for a product, an automaker put in an order for an unusually large number of units. The system responded by predicting huge demand for the product based largely on one unusual order. Blindly following the system's numbers could have led to inaccurate orders for materials being sent to suppliers within the chain. The company caught the problem but only after a demand forecaster threw out the system's numbers and used his own.

That created another problem: Forecasters stopped trusting the system and worked strictly with their own data. The supplier had to fine-tune the system itself, then work on reestablishing employees' confidence. Once employees understood that they would be merging their expertise with the system's increasing accuracy, they began to accept and use the new technology.

Historical and Accurate Data: Computers work on the GIGO principle - Garbage In, Garbage Out. Any computer system is only as good as the data it is fed. Inaccurate data, or simply not enough data, are among the invisible causes of failure of information systems.