Wednesday, November 30, 2011

Bill Gates Testifies: Microsoft Beat WordPerfect Fair and Square

Bill Gates testified in a Salt Lake City courtroom Monday in a lawsuit that accuses Microsoft of — surprise! — monopolistic behavior. The accuser: former WordPerfect owner Novell.

If you’re old enough to remember Buffy the Vampire Slayer as a lame movie, before it became a hip TV show, you may also recall a promising challenger to Microsoft Word called WordPerfect, which was bought by Novell in 1994.

WordPerfect was once a powerful brand, and it still has a following in academic and legal circles thanks to its differentiating features (such as streaming codes, which are similar to HTML tags). WordPerfect even had larger market share than Word in the 1980s and early 1990s, when the operating system of choice was DOS.

Then came Windows. Word for Windows debuted in 1989 and grew quickly, while WordPerfect stumbled into the party late, in 1992, with a buggy version.

The release of Windows 95 sealed WordPerfect's fate. There wasn't a Win 95 version of WordPerfect until May 1996, nine months after Word 95 debuted and began eating up even more market share. Novell ended up selling WordPerfect that year, to Corel, for what it says was a $1.2 billion loss.

In 2004, Novell finally cried foul. The company says WordPerfect never got a fair shot on Windows 95 since Microsoft shut it out of the development process, ostensibly in favor of Word. Novell names Gates himself, claiming he ordered Microsoft engineers to reject WordPerfect as a Windows 95 application because it was too good.

Gates himself took the stand Monday to give his side of the story. Under questioning by Microsoft lawyer Steven Holley, Gates denied the central argument of Novell’s suit — that the software giant withheld elements of Windows 95 in a way that undermined WordPerfect.

Gates said that creating Windows 95 was the “most challenging, trying project we had ever done.” He admitted that the development team removed a technical feature of the operating system that would have supported WordPerfect, because he believed it might crash Windows. In the end, Gates argued, Novell didn’t innovate fast enough, and Word was the better product.

The numbers tend to support Gates’ argument. Stan Liebowitz of the University of Texas charted the market share of Word vs. WordPerfect going back to 1986, and the data clearly show Word shooting up fast while WordPerfect sank from its once-dominant position, starting in the early 1990s.

[Chart: word processor market share, Word vs. WordPerfect]

Would a faster (and less buggy) release on Windows 95 have turned the tide? Or at least given WordPerfect a fair shake? That’s the central question U.S. District Judge J. Frederick Motz will have to answer. But let us know your take in the comments.

Tuesday, November 29, 2011

RIM moves to higher mobile ground with BlackBerry Mobile Fusion: Is it too late?

The new plan for RIM revolves around focusing on what it does best for mobile device management software—asset management, configuration, security policies, group administration and centralized management.

Research in Motion on Tuesday outlined plans to launch BlackBerry Mobile Fusion, enterprise software designed to manage a bevy of mobile devices including the iPhone and Android smartphones.

With RIM’s smartphone share taking its knocks, the company can’t afford to rely on selling a complete mobile stack—BlackBerry device, BlackBerry Enterprise Server and RIM management software—any more. BlackBerry Mobile Fusion will be available in March 2012.

The new plan for RIM revolves around focusing on what it does best for mobile device management software—asset management, configuration, security policies, group administration and centralized management. The challenge for RIM here is obvious: There are multiple mobile device management software providers and the BlackBerry Enterprise Server doesn’t have the lock-in it once did.

RIM appears to be trying to thread the needle between the bring your own device movement and selling its BlackBerry stack of mobile hardware and software.

How will this turn out? Here are three scenarios:

Best case: RIM’s focus on security and enterprise management puts it at the top of the mobile stack. RIM brings its security and enterprise management knowhow to a bevy of devices. RIM’s enterprise foothold gives it a leg up.

Middle-of-the-road case: RIM’s BlackBerry Mobile Fusion effort is a bit late, but manages to keep the company relevant even as it fades on smartphones. Companies loyal to the BlackBerry Enterprise Server naturally gravitate to Mobile Fusion. Other CIOs, however, look to other mobile device management suites offered by Sybase, Good and a bevy of other players that include Microsoft and Google.

Worst case: BlackBerry Mobile Fusion is viewed as a Hail Mary pass that comes too late. Technology executives begin to wonder why they need RIM as a mobile device management middleman when employees aren’t bringing BlackBerry devices to work.

It’s unclear how this BlackBerry Mobile Fusion effort will pan out; at this point, the three scenarios outlined above look roughly equally probable.

RIM’s move does remind me of a Clayton Christensen talk about innovation. Christensen, a Harvard professor, outlines innovation conundrums through the years. The common theme in multiple examples is how companies cede the lower ground in a market to move upstream to higher margin products. If you follow profit margin religion, you’re likely to outsource and give up on tough markets. The higher ground always looks better. The issue is that companies eventually run out of headroom and nothing is left. See Smart Planet: Clay Christensen: 5 observations on innovation

It’s a bit of a stretch to argue that RIM is ceding the device market in a bid to move up the mobile stack, but the writing—beginning with Mobile Fusion—may be on the wall.

Sunday, November 27, 2011

Physicians using tablets to treat patients

Within the next year, almost half of all doctors will be using tablets and other mobile devices to perform everyday tasks, such as accessing patient information in electronic medical records (EMRs), according to the survey by the Computing Technology Industry Association (CompTIA), a nonprofit group.

Today, a quarter of healthcare providers surveyed say they're using tablets in their practice. Another 21% indicated they expect to do so within a year.

CompTIA's Third Annual Healthcare IT Insights and Opportunities study was based on two separate online surveys: One focused on 350 doctors, dentists and other healthcare providers or administrators; the other polled 400 IT firms with healthcare IT practices. Both were conducted in late July and early August.

The study shows that more than half of healthcare professionals currently use a smartphone for work, and about a third use their smartphones or tablets to access EMR systems. Another 20% expect to start mobile usage with EMRs within the next year.


The U.S. government is pressing all medical facilities to roll out EHRs by 2016, but the going has been slow so far. By the end of next year, 58% of small physician practices are expected to have EHR systems in place.

By 2014, the federal government wants more than half of all healthcare facilities to use EHRs.

Facilities that roll out the systems -- and prove their meaningful use, according to federal standards -- can receive tens of thousands of dollars in reimbursement money under the American Recovery and Reinvestment Act of 2009. Some facilities, depending on their size, could get millions.

EHRs are also expected to promote the use of standardized medical practices.

CompTIA's survey also showed EMR system adoption is on the rise, with 38% of healthcare providers indicating they have a comprehensive system in place and 17% saying they have a partial system or module. Sixty-one percent of those with comprehensive EMR systems said they're generally satisfied with them.

"That's a respectable figure, but one that also indicates there's room for improvement in areas such as greater ease of use; better interoperability with other systems; faster speeds; improved remote access and mobility features; and more training," CompTIA said.

Doctors have reported feeling like data-entry clerks when typing their own notes into EMR systems. Others are unfamiliar with the technology and see it as yet one more learning curve to conquer in their jobs.

"We're a teaching hospital, so on one end of the spectrum we have residents born with a computer in their hand and so they look at this as an opportunity to move forward; on the other side of the spectrum are the physicians [who] are in their 60s and the very idea of signing onto a computer is a big question mark to them," said Bill Fawns, director of IT services at Kern Medical Center in Bakersfield, Calif.


Late last year, Kern Medical Center, a 222-bed acute-care teaching hospital, deployed an OpenVista EMR system from Medsphere Systems Corp.

"As mobile devices and applications have become more user-friendly, affordable and powerful, the appeal to businesses of all types, including healthcare providers, has grown exponentially," Tim Herbert, vice president of research at CompTIA, said in a statement.

The survey also touched on the adoption of cloud computing in the healthcare industry. The results showed cloud computing is clearly in its early stages, with 57% indicating they were not very familiar with the technology and just 5% stating they were using cloud services.

"It's worth noting, though, that some healthcare providers are likely using cloud-based applications, like software-as-a-service, and not thinking of it as cloud computing," the report stated.

The potential for its growth is strong. A key component of EMR meaningful use standards is the ability to share information, either through proprietary networks or through regional Health Information Exchanges, which will include many of the elements of cloud computing.

Adoption of telemedicine, where physicians consult with patients via teleconferencing, is still a ways off, the report said. Recent studies have shown that patients can be just as effectively treated through telemedicine as through traditional in-person visits. In fact, studies have shown telemedicine actually improves patient-physician communication.

However, just 14% of healthcare professionals reported actively following news and trends in telemedicine, according to CompTIA's survey.

Those surveyed indicated they saw the greatest benefits of telemedicine in the areas of continuing medical education (61%), specialist referral services (44%) and patient consultations (37%). Only one in 10 healthcare providers surveyed say they intend to use video conferencing for patient interaction within the next 12 months.

Saturday, November 26, 2011

Better Windows support due on IBM mainframes

IBM mainframes will soon be able to manage Windows applications, bridging one of the last major divides in data centers.

IBM had already announced that it intended to deliver that capability with its zEnterprise 196 mainframe, but it recently said the Windows management function will be available on Dec. 16.

There are many Windows-based applications, including ones made by IBM, inside most data centers; they typically interact with mainframes to access data. Historically, all Windows software has had to be managed separately.

But now, IBM has promised, the security and speed of mainframe environments that include Windows systems will be improved. It will be possible to connect systems on a private network, thus avoiding some network hops and enabling the use of integrated management tools.

Joe Clabby, an analyst at Clabby Analytics, said the new features should reduce the labor required to run mainframe environments that have multiple operating systems. Moreover, he added, "if you can manage this as a single architecture, it saves money."

Greg Lotko, business line executive in IBM's System z division, said the addition of Windows support "is really recognizing that the world is heterogeneous."

This version of this story was originally published in Computerworld's print edition. It was adapted from an article that appeared earlier on Computerworld.com.

Friday, November 25, 2011

Shaping the Cloud

HP Labs research explores how next-generation cloud systems might be built

It’s not easy to shape a cloud. That goes for cloud computing systems as much as the atmospheric phenomena for which they’re named.

But as cloud computing becomes ever more central to IT operations, how these intentionally diffuse systems are assembled will have an increasingly significant impact on what they’re able to do and how well they can run.

That’s the thinking at HP’s Cloud and Security Lab in Bristol, UK, where a novel approach to cloud computing, based on highly secure but flexibly managed cells, is being researched.

“It’s based on a completely new kind of architecture,” says Julio Guijarro of the model that he and his colleagues are calling Cells as a Service. “The vision we had required a lot of componentry that didn’t exist,” he adds, “so we had to build all that first in order to make it work.”

A new model

The Cells as a Service model grew out of work the Bristol team has been engaged in for over a decade.

Early investigations into Utility and Grid computing led them to want to build their own platform for deploying cloud-based services, says Guijarro. “That was generating a lot of interest among HP customers,” he recalls, “but nobody was providing the infrastructure that we also needed to build such a platform. So we went down a level and started working on that.”

Much of their early research had involved supplying digital animation companies with remote rendering services, an experience that underlined the degree to which IT customers need to be confident that they are protecting their intellectual property if they’re to migrate to lower-cost, cloud based systems.

As a result, explains fellow researcher Patrick Goldsack, “when we started to think about architecture, we started by asking how you can really provide a secure framework from the ground up, within which we can then deliver different services to different customers.”

A truly secure system, notes Goldsack, ought to be able to serve commercial rivals like Dreamworks and Pixar, or Pepsi and Coca Cola, at the same time, with each 100% confident that their own data is accessible only to themselves. At the same time, though, it should also allow them to collaborate if and when they wish. Banks, for example, might be rivals in some operations and partners in others.

For that to happen, adds Guijarro, “You have to build into the design the ability for A to talk to B when they want to, and for B to talk to A. That’s one of the main differentiators between our system and anything else that is out there today.”

Cells in two senses

The new infrastructure, like other cloud systems, is built out of virtualized machines created automatically and on the fly over sets of geographically distributed servers.

But in this model the virtualized servers have the quality of cells – both in the sense of biological building blocks and of highly secure holding pens for information.

“Each cell has a semi-permeable membrane with a tight control over that permeability,” notes Goldsack. “As a result, you can specify your relationship with other cells in the system and the level of permeability you have between the various other cells. In addition, you can alter those levels later on if you wish.”

Existing cloud models typically serve multiple users by requiring that every user own the same kind of rights over the same kind of machines. But thanks to its novel design, the Cells as a Service model offers much more flexibility while at the same time offering the kind of security that cloud customers need.
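
To make the permeability idea concrete, here is a purely hypothetical C# sketch of how such relationships might be modelled. None of these type or member names come from HP's Cells as a Service system; they are invented solely for illustration.

// Hypothetical model of "cells with controlled permeability" -- not HP's API.
using System;
using System.Collections.Generic;

enum Permeability { None, ReadOnly, ReadWrite }

class Cell
{
    public string Owner { get; }
    // Default: sealed off from every other cell.
    private readonly Dictionary<Cell, Permeability> _links = new();

    public Cell(string owner) => Owner = owner;

    // Permeability is granted per direction and can be changed later.
    public void SetPermeability(Cell other, Permeability level) => _links[other] = level;

    public bool CanRead(Cell requester) =>
        _links.TryGetValue(requester, out var p) && p != Permeability.None;
}

class Demo
{
    static void Main()
    {
        var pixar = new Cell("Pixar");
        var dreamworks = new Cell("DreamWorks");

        Console.WriteLine(pixar.CanRead(dreamworks));   // False: isolated by default

        // The two rivals choose to collaborate on one project.
        pixar.SetPermeability(dreamworks, Permeability.ReadOnly);
        Console.WriteLine(pixar.CanRead(dreamworks));   // True: A has opened up to B

        // B has still not opened anything to A; the relationship is one-way.
        Console.WriteLine(dreamworks.CanRead(pixar));   // False
    }
}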

Underlying innovations

In order to build this new architecture, the HP team had to create an array of underlying technologies that were themselves novel.

“The networking is completely new,” explains Guijarro. “It doesn’t require any special switches or anything. Instead, we virtualized the entire network in software, which gives us a strong control of what happens in the network layer.”

In addition, the Labs team reinvented how virtual disks connect up to virtual machines.

Traditionally, when you create a virtual machine, you create a virtual disk from which it will boot that contains a certain amount of data. It’s possible to create a virtual machine very fast, but as virtual disks have become ever larger they’ve been taking ever longer – over an hour sometimes – to create and copy from a base image.

“We get the performance up using a number of techniques such as copy-on-write and clever caching algorithms,” says Goldsack. “We can do per-volume encryption, and impose access control to the volumes. It’s an entirely new approach to virtual storage appropriate to the cloud.”

A third major innovation is in how the virtualization process is automated. “It’s a very highly distributed, disaggregated asynchronous mechanism, which scales out very beautifully,” Goldsack reports.

A popular demonstrator

The Cells as a Service model has recently been integrated into HP’s G-Cloud demonstrator at HPL Bristol. Now one year old and in its second generation, the demonstrator was built to offer governments a sense of how cloud computing might help them deliver core services much more cost-effectively and efficiently.

Interest so far has been ‘amazing,’ says Goldsack, who notes that corporations are proving as interested as government agencies in sending their senior technology executives to visit.

The tours, he notes, “allow us to share our vision of what Cloud can do for you, what the issues are that you might think about and how we are approaching some of those issues like security, privacy, speed, reliability and high availability.”

At the same time, the visitors help the HP team understand the concerns that potential customers have about the cloud and where they are in the process of moving their services into the cloud environment.

Back to the platform

Next the team wants to move back up the software hierarchy to where they started originally: to look at the kinds of platforms they can build on a Cells as a Service infrastructure.

“That’s really where our interest has always been,” notes Goldsack. “And we’ll likely start doing some work with the client side, too, looking at better ways to bring clients into the cloud.”

After that, they’ll likely create model software solutions to show off the unique capabilities of the Cells as a Service infrastructure.

“This is just the beginning,” suggests Guijarro, “and I don’t think anybody knows where this is going to go.” All along, he notes, the idea has been to give people confidence to move more to the cloud – a move that ought, in turn, to let them be more innovative in the services they offer.

“It’s one of the most interesting aspects of the research that we do,” he says, “that we’re enabling people to do things that nobody has thought of yet.”

Wednesday, November 23, 2011

Intel recasts Pentium chip for servers

Intel is giving new life to its Pentium processor for servers, and has started shipping the new Pentium 350 chip for low-end servers.

The dual-core processor operates at a clock speed of 1.2GHz and has 3MB of cache. Like many server chips, the Pentium 350 lacks features such as integrated graphics, which are on most of Intel's laptop and desktop processors.

The iconic Pentium line of processors has been around for more than a decade, but now is mostly targeted at budget laptops and desktops. Pentium was Intel's flagship PC processor line, a mantle now held by the Core chips. The company once offered Pentium III and Pentium II Xeon processors for servers.

An Intel spokesman said the chip is targeted at microservers, which are low-power, compact servers for Web serving and content delivery services. Intel already offers Xeon E3 chips and is soon expected to launch new chips based on Atom for microservers.

The new processor is an acknowledgement of the Pentium brand's staying power, said Dean McCarron, principal analyst at Mercury Research. Besides microservers, the Pentium 350 could also be used in inexpensive, task-specific servers for storage, printing or document sharing, according to McCarron.

"What we're seeing is a repurposed part," McCarron said. The Pentium 350 is a cheaper alternative to Intel's PC chips, which could also be used in servers but are more expensive with additional features such as integrated graphics.

The new processor draws 15 watts of power and there's a remote chance it could be used in blades, McCarron said. The processor is, however, not a replacement for Intel's current low-power Atom processors. These are typically for netbooks and tablets, but are also being used in high-density servers such as SeaMicro's SM10000-64HD to process cloud transactions.

Targeting the new Pentium chip at servers could also be a tacit acknowledgement that Intel wants Pentium to replace the Celeron brand, which is the lowest rung of Intel's processors. Celeron processors are used in low-cost desktops and laptops, and in a few cases, low-end servers.

Intel declined to provide pricing for the Pentium 350.

Tuesday, November 22, 2011

Oracle: HP paying Intel to keep Itanium going

Hewlett-Packard has secretly contracted with Intel to keep making Itanium processors so that HP can maintain the appearance that "a dead microprocessor is still alive", and make money from its locked-in Itanium customer base and take business away from Oracle's Sun servers, Oracle said in a court filing on Friday.

The market has never been told that Itanium lives on because HP is paying Intel to keep it going, Oracle said. Intel's independent business judgement would have killed off Itanium years ago, it added.

HP however described the filing as a "desperate delay tactic designed to extend the paralyzing uncertainty in the marketplace" that it said was created when Oracle announced in March, 2011, in a breach of contract, that "it would no longer support HP's Itanium platform".

HP has made statements to the marketplace to the effect that Intel's commitment to Itanium is its own, based on its normal calculations for investing in processors that it believes have a future, Oracle said in a filing before the Superior Court of the State of California for the County of Santa Clara.

A public redacted version of the filing was made available to The Wall Street Journal's AllThingsD blog. Intel said it is not a party to the lawsuit, and therefore does not have any comment on it. "Intel does not comment on commercial agreements that we may or may not have with our customers," the chip giant said in an e-mailed statement.

Oracle also claimed that HP had kept secret from the market, but revealed in a filing two days previously, that HP and Intel have a contractual commitment that Itanium will continue through the next two generations of microprocessors.

HP's strategy behind its "false statements" about Intel's support for Itanium was to take away business from Oracle Sun, and "reap lucrative revenues from the locked-in Itanium customer base using HP's HP-UX operating system on Itanium servers", as the company gets few service contracts on operating systems like Linux that run on x86 processors, Oracle said.

Oracle acquired Sun Microsystems last year.

HP filed a suit in June over Oracle's decision to stop developing software for the Itanium processor, the chip used in HP's high-end servers, claiming that Oracle's decision violates "legally binding commitments" that it made to HP and the companies' 140,000 joint customers.

Oracle said at the time that HP tricked it into signing an agreement last September to continue its support for Itanium, even though HP knew of an Intel plan to discontinue Itanium. HP already knew all about Intel's plans to discontinue Itanium, and HP was concerned about what would happen when Oracle found out about that plan, Oracle said in a statement in June.

As Oracle well knows, HP and Intel have a contractual commitment to continue to sell mission-critical Itanium processors to customers through the next two generations of microprocessors, thus ensuring the availability of Itanium through at least the end of the decade, HP said in a statement.

"The fact remains that Oracle's decision to cut off support for Itanium was an illicit business strategy it conjured to try to force Itanium customers into buying Sun servers -- and destroy choice in the marketplace," HP said.

Monday, November 21, 2011

Commonhold And Leasehold Reform Right To Manage Essentials

If you own a flat and the management of the shared parts of the development is under the control of your freeholder, then you can take that control away. The law allows you to follow a detailed process and set up your own body which has responsibility for things such as cleaning the windows, cleaning the hallways, garden maintenance, roof repairs, electricity in the shared areas and many other amenities.

Why would anyone want to do that? Well, under the Commonhold and Leasehold Reform Act 2002, you do not need to give your freeholder a reason. It is thus called a 'no-fault' right, and the process is called Right to Manage. As such, there is no need to prove negligence by the organisation in charge of the management: you can go ahead and do it.

The maintenance and other elements of the shared parts of the development are usually looked after by a company referred to as a managing agent. It is important to distinguish between this and a residents' management company or management company. The managing agent will be a third party company that is contracted to look after agreed aspects of the management. They are usually contracted by the development's management company.


Operating models do vary: some smaller blocks handle the maintenance themselves and share the jobs between the flat owners, agreeing who will liaise with the contractors and how they will be paid. Some work, such as garden maintenance, might even be undertaken by the leaseholders themselves, although a lay person may not know the full legal obligations of management that still need to be observed.

Most blocks will need a managing agent who specialises in negotiation with suppliers, handling compliance, scheduling works, quality control and the many varied aspects of looking after a large building and its grounds.

There are a number of possible motivators for making the big change and exercising the Right to Manage. These include the perception that you are being overcharged. It is very common for leaseholders to feel that they are not getting value for money from their managing agent. This is especially true if the managing agent is either controlled by or even owned by the freeholder of the development. Some might argue this leads to a conflict of interest, particularly if the freeholder might benefit financially from the maintenance charges imposed on leaseholders.


Even if the managing agent is independent of the freeholder - although still appointed by them - in many cases lessees cannot change the provider. RTM changes all that. Where quality of service issues are identified, flat owners can now determine their own destiny and appoint a managing agent that may improve the standard of gardening, for example.

If there is friction between owners and the freeholder, unfortunately RTM won't solve this. The reason is that the freeholder has a right to have membership of the RTM company, so they will still need to be consulted and involved in all aspects of the development's maintenance.

Communication between all parties is key. However poor your existing managing agent, recognise that a huge amount of consultation between the contractors, short-lease tenants, freeholders and managing agents needs to constantly take place. It is very time consuming, emotionally draining and demands good organisational and negotiation skills. Additionally, the lessees involved in liaising with the managing agent will change as the ownership of the flats changes, so needs and demands are always shifting. Whether you decide to exercise your Right to Manage or not, the important thing is to weigh up the options and understand all the implications. Speak to your local managing agents, especially if they are members of professional groups, and get their advice.

Saturday, November 19, 2011

Security pros seek hacking, forensics skills

IT professionals looking to boost their high-tech careers in the coming five years are betting on security certifications and skills to help them stand out to potential employers, according to a new survey.

CompTIA, an IT industry trade association, polled some 1,537 high-tech workers and found 37% intend to pursue a security certification over the next five years. Separately, nearly 20% indicated they would seek ethical hacking certification over the same time period. And another 13% pinpointed forensics as the next certification goal in their career development.

"When you add the results, you will see that about two-thirds of IT workers intend to add some type of security certification to their portfolio," says Terry Erdle, senior vice president of skills certifications. "This trend is driven by two factors: one, security issues are pervasive, and two, more and more people are moving to managed services and software-as-a-service models, which involves more complex networking. That level of non-enterprise data center computing has people look more closely at their security infrastructure."

High-tech workers surveyed cited economic advancement and personal growth as the motivation to seek further certifications. Nearly 90% said they want to spruce up their resumes and another 88% said they hope to grow personally with new certifications. Emerging technologies and vertical industry trends also drive certification seekers. For instance, SaaS ranked among the technologies in which IT workers intend to seek certifications in the coming years. Green IT, mobile and healthcare IT also figured in high-tech workers' career development plans.

"We are going to see upwards of 70,000 IT jobs in healthcare and the related network and storage skills that come with electronic records, such as e-reporting and e-charting," Erdle adds. "We are working now to determine what kind of IT roles should be supported in certifications from CompTIA."

Wednesday, November 16, 2011

Exascale computing seen in this decade

There is an almost obsessive focus at the supercomputing conference here on reaching exascale computing within this decade, a level of computing power roughly 1,000 times greater than anything running today.

In the lives of most people, something that is eight or nine years off may seem like a long time, but at SC11, it feels as if it is just around the corner. Part of the push is coming from the U.S. Department of Energy, which will fund these massive systems. The DOE told the industry this summer that it wants an exascale system delivered in the 2019-2020 timeframe that won't use more than 20 MW of power. The government has been seeking proposals about how to achieve this.

To put 20 MW of power in perspective, consider the supercomputer that IBM is building for the DOE's Lawrence Livermore National Laboratory. This system will be capable of speeds of 20 petaflops. It will be one of the largest supercomputers in the world as well as one of the most energy efficient. But when it is completely turned on next year, it will still use somewhere in the range of 7 to 8 MW of power, according to IBM. An exascale system has the compute power of 1,000 petaflops. (A petaflop is a quadrillion floating-point operations per second.)

"We're in a power constrained world now," said Steve Scott, the CTO of Nvidia Corp.'s Telsa business, "where the performance we can get on a chip is constrained not by the number of transistors we can put on a chip, but rather by the power."

Scott sees x86 processing as limited by its overhead. GPU processors, in contrast, provide throughput with very little overhead, and with less energy per operation.

Nvidia has been building HPC (high-performance computing) systems that pair its GPUs with CPUs, often enough from Advanced Micro Devices. This hybrid approach is also moving toward ARM processors, widely used in cell phones, which may lead to an integrated GPU and ARM hybrid processor.

Scott believes the DOE's 20 MW goal can be achieved by 2022. But if the government's exascale program comes through with funding, it may enable Nvidia to be more aggressive in circuit and architectural techniques, making it possible to achieve that power level goal by 2019.

Scott said reaching that level of efficiency will require improving power usage by 50 times.

While 20 MW may seem like a lot of power, Scott points out that there are cloud computing facilities that require as much as 100 MW of power.

Rajeeb Hazra, general manager of Technical Computing at Intel, said his company plans to meet the exascale, 20 MW goal by 2018, one year ahead of the U.S. government's expectation. He made this remark during the company's unveiling of Knights Corner, its new 50-core processor that's capable of one teraflop of sustained performance.

While the hardware makers deal with power and performance issues, exascale computing, like petaflop computing before it, presents HPC users with the challenge of scaling their codes to fully use these systems.

Before reaching exascale, vendors will produce systems that can scale into the hundreds of petaflops. IBM, for instance, says its new Blue Gene/Q system will be capable of 100 petaflops.

Kim Cupps, the computing division leader and Sequoia project manager at Lawrence Livermore, will be happy with 20 petaflops.

"We're thrilled to have to have this machine so close to our grasp," said Cupps, or her 20 petaflop system. "We are going to solve many problems of national importance, ranging from material's modeling, weapons science, climate change and energy modeling."

Of IBM's claim that its system can scale to 100 petaflops, "that's IBM saying that," said Cupps, "I'll vouch for 20."

Tuesday, November 15, 2011

Firewalls fail to stem tide of DDoS attacks, survey finds

Companies still rely heavily on firewalls to defend themselves against denial-of-service attacks despite the fact that this class of device is often not up to the task, a new survey by F5 Networks has found.

The survey of 1,000 medium and large organisations in 10 countries found that up to 45 percent of respondents experience such attacks on a regular basis, a mixture of application and network-layer incursions.

About half rated denial-of-service attacks as highly effective, with 79 percent saying they still relied on firewalls to deflect them, even though 42 percent found that such devices were ineffective against conventional attacks at the network layer.

The research also found that nearly half had detected attempts to access encrypted data on networks, with 44 percent noticing attacks against DNS servers, one of the most difficult-to-defend assets.

"Whilst many organisations can view insider threats as the most difficult to defend against, the research clearly demonstrates that external threats remain a potent force, and companies need to be aware of the most effective ways to safeguard themselves," said F5's technical director, Gary Newel.

Attacks on DNS servers were a clear worry, rated among the top three hardest-to-repel attacks by a third of those asked.

"IT managers are between a rock and a hard place as attacks become more sophisticated and the cost of a breach continues to rise," said Newel.

The anxiety over DDoS attacks is far from new, although exactly how to defend against them, not surprisingly, divides vendors.

Some see the best solution as better routing infrastructure, because routers are the first to handle DDoS packets as they move into a network. F5, for its part, is pushing its BIG-IP Application Delivery Controllers, which act in effect as load-balancing application firewalls. Another option is to use multiple layers and bundle the hardware level as a service.

During the recent launch of the Technology Operations Centre for the 2012 Olympic Games in London, organisers touted an array of security measures to counter the menace of a large DDoS disrupting content distribution from the global event.

Saturday, November 12, 2011

HP, Dell pass Nokia in new Greenpeace green IT guide

While some consumer electronics manufacturers have cleaned up their act, making more energy-efficient gadgets with fewer toxic materials, others are continuing to make fine promises but no changes, according to Greenpeace. In response, the environmental pressure group is changing the way it scores companies in its Guide to Greener Electronics, placing more emphasis on their actions than their words, and measuring new aspects of their operations.

The latest guide to greener PCs, TVs and mobile phones recognizes the efforts Hewlett-Packard and Dell have made to clean up their supply chain, catapulting them up the rankings, while Nokia has slipped back to third place after leading since September 2008. BlackBerry maker Research In Motion came in last place.

HP scored 5.9 out of 10, the energy efficiency of its products and its avoidance of hazardous substances putting it well ahead of second-placed Dell, with 5.1 points, Nokia with 4.9 and Apple with 4.6.

Greenpeace rated Nokia's products more energy efficient than HP's, but gave the PC manufacturer bonus points for a range of sustainable operations criteria that haven't appeared in the rankings before.

For instance, HP scored well for its advocacy of clean energy policy, its clean electricity plan and its clear policy in favor of sustainably sourced paper and against deforestation -- something still missing at Nokia, which began life as a paper manufacturer.

"The original guide criteria were created in 2006, and there's been a lot of progress on toxics phaseouts since then," said Greenpeace senior campaigner Tom Dowdall.

The Greenpeace guide originally set out to encourage manufacturers to stop using materials such as brominated flame retardants (BFRs) and PVC insulation in their products, and many manufacturers did so between 2008 and 2010.

Companies such as HP, Apple and Acer were particularly quick to react, said Dowdall. Apple's products are now BFR-free and the company uses no PVC, except where required by local safety regulations on power cords, according to the latest edition of the guide.

While the new rankings still measure the energy consumption and PVC and BFR content of finished products, Greenpeace is now targeting the way they are made, asking companies to report on the greenhouse gas emissions and type of energy used by the factories that make them, and even the suppliers of the components and raw materials that feed the assembly lines.

"If you haven't measured it, you can't reduce it. Measurement and disclosure are the basics of any target-setting," Dowdall said.

Greenpeace has changed the scope of the guide in other ways. Games console manufacturers Microsoft and Nintendo are no longer ranked, and neither are PC maker Fujitsu nor phone manufacturer Motorola Mobility. The report now covers 15 major manufacturers of PCs, TVs and mobile phones: HP (5.9 points out of 10), Dell (5.1), Nokia (4.9), Apple (4.6), Philips (4.5), Sony Ericsson (4.2), Samsung Electronics (4.1), Lenovo (3.8), Panasonic (3.6), Sony (3.6), Sharp (3.0), Acer (2.9), LG Electronics and Toshiba (both 2.8), and Research in Motion (1.6). A full breakdown of the scores can be found on the Greenpeace website.

Greenpeace praised all but the last four companies for either their product energy efficiency, the disclosure of their own greenhouse gas emissions, or both. HP and Dell were also praised for their sustainable sourcing of paper products, Nokia and Apple for their voluntary take-back programs in countries without electronic product recycling laws, and Sony Ericsson for its chemicals management and advocacy.

Sony was criticized (and lost a point) for opposing legislation in California to make battery chargers and appliances more energy efficient.

Twelve of the companies (all save Acer, LG and Sony Ericsson) were warned that they too will lose a point next year if they continue to support industry bodies that have also opposed the energy efficiency legislation.

Thursday, November 10, 2011

Microsoft patches critical Windows 7 bug, downplays exploit threat

Microsoft today delivered four security updates that patched four vulnerabilities in Windows, most of them affecting the newer editions, Windows Vista and Windows 7.

Only one of the updates was marked "critical," Microsoft's most-serious threat ranking. Two of the remaining were labeled "important" and the fourth was tagged as "moderate."

As expected, Microsoft did not patch the Windows kernel vulnerability exploited by the Duqu campaign.

At the top of Microsoft's chart today -- and on outside researchers' to-do lists as well -- was the MS11-083 update, which patches a bug in the TCP/IP stack, the component that regulates Internet connections, in Windows Vista, Windows 7 and Server 2008.

The vulnerability could be used by attackers in certain circumstances to hijack an unpatched PC, said Microsoft, which nevertheless downplayed the likelihood of successful attacks.

"This critical bug allows an attack via the network, and looks troublesome at first glance," said Andrew Storms, director of security operations at nCircle Security. "But it doesn't look very easy to pull off, so in this case, it's not as big a concern as one would think."

Storms pointed to a post by Microsoft engineers on the Security Research & Defense blog that spelled out the necessary conditions for an effective attack.

"We believe it is difficult to achieve [remote code execution] using this vulnerability considering that the type of network packets required are normally filtered at the perimeter and the small timing window ... and [that] a large number of packets are required to pull off the attack," wrote Ali Rahbar and Mark Wodrich of the Microsoft Security Response Center (MSRC).

Microsoft gave the vulnerability an exploitability index rating of "2," meaning that it expects only unreliable exploit code to appear in the next 30 days.

Even so, some researchers warned that if criminals focused their attention on the bug, they may be able to craft a consistent exploit that could be used to launch worm-based attacks.

Microsoft also updated Windows Mail and Windows Meeting Space on Vista, Windows 7 and Server 2008 to fix yet another "DLL load hijacking" vulnerability.

DLL load hijacking, sometimes called "binary pre-loading," describes a class of bugs first revealed in August 2010. Microsoft has been patching its software to fix the problem -- which can be exploited by tricking an application into loading a malicious file with the same name as a required dynamic link library, or DLL -- since last November.
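
For illustration, here is a minimal C# sketch of the commonly recommended mitigations for this bug class. It uses only the standard Win32 calls SetDllDirectory and LoadLibrary via P/Invoke; it is a general example of avoiding the unsafe search order, not anything specific to the components patched this month.

using System;
using System.Runtime.InteropServices;

class DllHijackMitigation
{
    // Removes the current working directory from the DLL search path,
    // a documented Win32 mitigation for DLL load hijacking.
    [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern bool SetDllDirectory(string lpPathName);

    // Loading by full path avoids the default search order entirely.
    [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern IntPtr LoadLibrary(string lpFileName);

    static void Main()
    {
        // Passing an empty string drops the current directory from the search order.
        SetDllDirectory(string.Empty);

        // Risky pattern: a bare name can be satisfied by a planted DLL sitting in
        // the current directory, e.g. next to a document the user just opened.
        // IntPtr risky = LoadLibrary("somecodec.dll");

        // Safer pattern: resolve an explicit, trusted path first.
        string system32 = Environment.GetFolderPath(Environment.SpecialFolder.System);
        IntPtr handle = LoadLibrary(System.IO.Path.Combine(system32, "version.dll"));
        Console.WriteLine(handle == IntPtr.Zero ? "load failed" : "loaded from System32");
    }
}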

Today's MS11-085 update was the eighteenth Microsoft has issued to fix DLL load-hijacking vulnerabilities in its software.

"They're a dime-a-dozen these days," said Storms of the latest in the long-running series.

Researchers also noted that while Microsoft did not patch the Duqu-exploited bug, it did fix a different flaw in the TrueType font parsing engine, the component targeted by the Trojan's attacks.

MS11-084 fixes a single vulnerability in the Windows kernel-mode driver "Win32k.sys" that can be exploited through a malformed TrueType font file.

"We're see a pattern of kernel-level bugs and parsing of font files," said Storms. "And they're going to have to come back and patch this again for Duqu."

Microsoft patched the TrueType engine within Win32k.sys just last month, fixing a flaw that let hackers conduct denial-of-service attacks to cripple Windows PCs. Today's bug was also categorized as a denial-of-service flaw.

In lieu of a fix, Microsoft last week told customers that they could defend their systems by blocking access to "t2embed.dll," the dynamic link library that handles embedded TrueType fonts.

An advisory offered command-prompt strings IT administrators can use to deny access to t2embed.dll, and links to one of Microsoft's "Fix-it" tools that automate the process of blocking or unblocking access to the library.

Blocking t2embed.dll, however, has side effects: Applications, including Web browsers, applications in Microsoft's Office suite and Adobe's Reader, may not render text properly.

Microsoft also updated that advisory today with a link to a list of its antivirus partners that have issued signatures to detect the kernel-based Duqu attacks.

November's security patches can be downloaded and installed via the Microsoft Update and Windows Update services, as well as through Windows Server Update Services.

BlackBerry finds safe haven in UK, despite a fall in sales

While BlackBerry sales continue to suffer across the U.S. and Canada, the smartphone has found a safe haven in the UK, of all places.

BlackBerrys, despite their once popular status across the U.S., Canada and Europe, have found a safe haven in exile in the United Kingdom, of all places.

It has been only a month since the global BlackBerry outage, which stretched across four days and affected over half of the worldwide user base, an estimated 50 million users. The press coverage alone ripped a hole in Research in Motion’s communications strategy, which in some cases left BlackBerry users more frustrated than the outage itself did.

But against the odds, BlackBerry retained its high status and stormed ahead of both Nokia and Apple in the UK market, even though its shipments declined 2 percent; Apple’s iPhone shipments fell by just over a quarter.

On the bright side, Samsung jumped 178 percent to 1.1 million shipments, and HTC grew by nearly half to 800,000 shipments.

Nokia’s sales slumped the most, down 87 percent to a meagre 130,000 shipments, while Sony Ericsson fell 32 percent to just 300,000 shipments, after it was announced last month that Sony would buy out Ericsson’s share and take the venture into its wholly owned arms.

With many considering the company’s Symbian operating system outdated or “perceived as dead”, as Canalys senior analyst Tim Shepherd put it, the Finnish phone giant is hoping to regroup around its Microsoft partnership as it rolls out Windows-powered phones.

One of the reasons for Apple’s loss could be the limited availability of the new iPhone 4S, with mobile phone contract renewal or upgrade dates falling at an inconvenient time. It is thought, however, that the Christmas holiday period will boost smartphone sales.

Plus, since many had hoped that the next-generation iPhone 5 would be out in September, some are still holding out for that model next year, skipping the iPhone 4S altogether.

Statistics from analytics firm Canalys show smartphone shipments declined by 7 percent to 5.3 million, with a mediocre performance across the major manufacturers.

Research in Motion’s market share has fallen below 10 percent, according to Shepherd, who noted its 58 percent year-over-year drop in North America.

But in areas of high wealth the picture differs: the United Arab Emirates, for example, showed a massively strong quarter of growth, up 181 percent to 1.4 million units shipped in the region.

But the outage alone did not deter loyal followers of the BlackBerry brand. The popular BlackBerry Messenger service is still a unique selling point, particularly for the younger generation across Europe, though it is now rivalled by Apple’s iMessage service, which offers much the same functionality to iPhone and iOS 5 users.

Tuesday, November 8, 2011

Get Microsoft 70-680 Exam Video Training

Get Microsoft 70-680 Exam Video training tips straight from the front lines with 70-680 Exam Video dumps, with 100% free access to literally thousands of brain dumps and special 70-680 Exam Video braindumps.


You get quality MCTS 70-680 Exam Video dumps from reliable IT community members who have already passed their 70-680 Exam Video and are using Microsoft technologies on a daily basis. Use their free brain dumps to advance your own career, and then help those who come behind you by creating new 70-680 questions-and-answers sample tests with what you found most valuable as a successful and passing Microsoft 70-680 Exam Video student.

Join the free braindump and MCTS 70-680 Exam Video braindump community by contributing to existing 70-680 dumps by suggesting explanations, possible solutions and answers, and of course by submitting your own content to be used by others. Who knows, your 70-680 Exam Video dump could make it into one of our 70-680 Master Dumps!

One of the biggest names in IT, Microsoft has a number of certifications for IT-minded students. Among these MCTS certifications, the 70-680 Exam Video is considered one of the most professional and highly demanding certifications. This is a composite exam and requires Microsoft 70-680 notes and TS: Windows 7, Configuring material for practice. The function of the Microsoft 70-680 test is to analyze the student's knowledge and skill in the essential Microsoft core skills. Being a little tough and technical in nature, it is difficult for most students, hence the use of braindumps to pass Microsoft 70-680 Exam Video questions. The main complexities here are the various network types, connecting to Microsoft 70-680 labs WAN, network security, and Microsoft pass 70-680 routing and switching. To cover all these hard-to-understand topics, MCITP: Enterprise Desktop Support Technician 7 materials are easily available on the website.

Qualifying for the Microsoft 70-680 Exam Video papers means that you have accomplished something major in your IT career that can take you to new levels of success. Every individual wants a successful career, and especially in IT this is the certification that you need to accomplish your goals. Here is a guideline for what sort of content will be in MCTS 70-680 study materials. Describing the purpose and functions of the technical aspects of the Microsoft 70-680 lab will be a significant part of your preparation.

In order to learn these things you will need to study Microsoft 70-680 latest dumps, as these are designed by IT experts and include each and every Microsoft 70-680 practice question that can appear in the exam. These professionally designed MCTS 70-680 free dumps contain all the past exam questions, recurring questions and those that seldom appear in exams. You will see that Microsoft 70-680 certkingdom dumps cover Microsoft 70-680 video configuration, verifying, and troubleshooting Microsoft 70-680 quiz issues successfully. Performing fundamental Microsoft tasks and troubleshooting becomes easier through prepping with the Microsoft 70-680 practice exams brain dump. Training with braindumps will reduce the overall 70-680 cost due to the lower fees and prices associated with these tools. Other than this, Microsoft 70-680 braindumps will provide you with information found only in MCTS 70-680 cbt media. After downloading your free Microsoft 70-680 dumps, preparing for your test will become less of a burden. All this content may sound difficult, but the special design of these certkingdom Microsoft 70-680 dumps is very helpful, focusing on each and every TS: Windows 7, Configuring question that has appeared in Microsoft's exam history.
The 70-680 Exam Video has been designed for professionals who analyze business requirements. The author devotes herself to researching the problems and knowledge of MCSE certification. If you have any questions about MCSE, you can comment on the articles the author publishes.

Friday, November 4, 2011

Microsoft's Roslyn: Reinventing the compiler as we know it

Whatever you may think of its business practices, Microsoft has always been top-notch when it comes to developer tools. Visual Studio is widely hailed as the best IDE out there, and .Net is an intelligently designed platform that borrows the best of what Java has to offer and takes it a few steps further.

No doubt that's why so many developers got nervous when Microsoft started touting JavaScript and HTML5 as application development tools earlier this year. Those fears were compounded at this year's Build conference (née PDC), when all the buzz seemed to be about Metro-style apps rather than traditional desktop software. Some developers even worried that the new emphasis on Web technologies and the new WinRT APIs meant Microsoft was forsaking .Net altogether.


Nothing could be further from the truth. Looking past the Metro hype, the Build conference also revealed promising road maps for C#, Visual Studio, and the .Net platform as a whole.

Perhaps the most exciting demo of the conference for .Net developers, however, was Project Roslyn, a new technology that Microsoft made available yesterday as a Community Technology Preview (CTP). Roslyn aims to bring powerful new features to C#, Visual Basic, and Visual Studio, but it's really much more than that. If it succeeds, it will reinvent how we view compilers and compiled languages altogether.

Deconstructing the compiler
Roslyn has been described as "compiler-as-a-service technology," a term that's caused a lot of confusion. I've even seen headlines heralding the project as "Microsoft's cloud compiler service" or "bringing .Net to the cloud." None of that is correct. Technically, it would be possible to offer code compilation as a cloud-based service, but it's hard to see the advantage, except in special circumstances.

Roslyn isn't services in the sense of software-as-a-service (SaaS), platform-as-a-service (PaaS), or similar cloud offerings. Rather, it's services in the sense of Windows services. Roslyn is a complete reengineering of Microsoft's .Net compiler toolchain, such that each phase of the compilation process is exposed as a service that other applications can consume.

As Microsoft's Anders Hejlsberg explained in a Build conference session, "Traditionally, a compiler is just sort of a black box. On one side you feed it source files, magic happens, and out the other end comes object files, or assemblies, or whatever the output format is."

Internally, however, there's a lot more going on. Typically, first the compiler parses your source code and breaks it down into a syntax tree. Then it builds a list of all the symbols in your program. Then it begins binding the symbols with the appropriate objects and so on.

An ordinary compiler discards all of this intermediate information once the final code is output. But with Roslyn-enabled compilers, the data from each step is accessible via its own .Net APIs. For example, a call to one API will return the entire syntax tree of a given piece of code as an object. A call to another API might return the number of methods in the code.
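
To make that concrete, here is a minimal sketch of what querying a syntax tree looks like. It assumes the Microsoft.CodeAnalysis namespaces used by shipped Roslyn packages rather than the 2011 CTP's Roslyn.Compilers namespaces, and the class being parsed is invented for illustration: the program parses a snippet of C# and counts the methods it declares.

    using System;
    using System.Linq;
    using Microsoft.CodeAnalysis.CSharp;
    using Microsoft.CodeAnalysis.CSharp.Syntax;

    class SyntaxTreeDemo
    {
        static void Main()
        {
            // Parse a snippet of C# into a full syntax tree object.
            var tree = CSharpSyntaxTree.ParseText(@"
                class Greeter
                {
                    void Hello() { }
                    int Add(int a, int b) { return a + b; }
                }");

            var root = tree.GetRoot();

            // Walk the tree and count the method declarations it contains.
            var methodCount = root.DescendantNodes()
                                  .OfType<MethodDeclarationSyntax>()
                                  .Count();

            Console.WriteLine("Methods found: " + methodCount);   // prints 2
        }
    }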

So what is Roslyn good for?
The most obvious advantage of this kind of "deconstructed" compiler is that it allows the entire compile-execute process to be invoked from within .Net applications. Hejlsberg demonstrated a C# program that passed a few code snippets to the C# compiler as strings; the compiler returned the resulting IL assembly code as an object, which was then passed to the Common Language Runtime (CLR) for execution. Voilà! With Roslyn, C# gains a dynamic language's ability to generate and invoke code at runtime.
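
A rough sketch of that compile-and-run round trip follows. Again this assumes the shipped Microsoft.CodeAnalysis namespaces rather than the CTP's, the generated class and method names are hypothetical, and the reference list may need extending depending on the target runtime: a string of C# is parsed, compiled to IL in memory, loaded as an assembly, and invoked.

    using System;
    using System.IO;
    using System.Reflection;
    using Microsoft.CodeAnalysis;
    using Microsoft.CodeAnalysis.CSharp;

    class CompileAndRun
    {
        static void Main()
        {
            // The source code arrives as an ordinary string.
            var source = @"
                public static class Generated
                {
                    public static int Square(int x) { return x * x; }
                }";

            var tree = CSharpSyntaxTree.ParseText(source);

            var compilation = CSharpCompilation.Create(
                "GeneratedAssembly",
                new[] { tree },
                // Depending on the runtime, additional references may be needed.
                new[] { MetadataReference.CreateFromFile(typeof(object).Assembly.Location) },
                new CSharpCompilationOptions(OutputKind.DynamicallyLinkedLibrary));

            using (var ms = new MemoryStream())
            {
                // Emit IL into memory instead of writing a file to disk.
                var result = compilation.Emit(ms);
                if (!result.Success)
                {
                    Console.WriteLine("Compilation failed.");
                    return;
                }

                // Load the assembly and invoke the freshly compiled method.
                var assembly = Assembly.Load(ms.ToArray());
                var square = assembly.GetType("Generated").GetMethod("Square");
                Console.WriteLine(square.Invoke(null, new object[] { 7 }));   // prints 49
            }
        }
    }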

Put that same code into a loop that accepts input from the user, and you've created a fully interactive read-eval-print loop (REPL) console for C#, allowing you to manipulate and experiment with .Net APIs and objects in real time. With the Roslyn technology, C# may still be a compiled language, but it effectively gains all the flexibility and expressiveness that dynamic languages such as Python and Ruby have to offer.
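
For flavor, a toy REPL along those lines can be sketched with the C# scripting APIs, shown here as Microsoft.CodeAnalysis.CSharp.Scripting (the CTP exposed a similar Roslyn.Scripting engine). This is only a minimal approximation of the console Hejlsberg demoed, not the real thing.

    using System;
    using System.Threading.Tasks;
    using Microsoft.CodeAnalysis.CSharp.Scripting;
    using Microsoft.CodeAnalysis.Scripting;

    class MiniRepl
    {
        static async Task Main()
        {
            ScriptState<object> state = null;

            while (true)
            {
                Console.Write("> ");
                var line = Console.ReadLine();
                if (string.IsNullOrWhiteSpace(line)) break;

                try
                {
                    // Each submission continues from the previous one,
                    // so variables declared earlier stay in scope.
                    state = state == null
                        ? await CSharpScript.RunAsync(line)
                        : await state.ContinueWithAsync(line);

                    if (state.ReturnValue != null)
                        Console.WriteLine(state.ReturnValue);
                }
                catch (CompilationErrorException e)
                {
                    Console.WriteLine(string.Join(Environment.NewLine, e.Diagnostics));
                }
            }
        }
    }

Typing int x = 21; at the prompt and then x * 2 on the next line prints 42, with the variable carried across submissions, which is exactly the kind of interactivity the article describes.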

The separate phases of the compilation process have their uses, too. For example, according to a blog post by Microsoft's Eric Lippert (Silverlight required), various groups have written their own C# language parsers, even within Microsoft. Maybe the Visual Studio team needed to write a syntax-coloring component, or maybe another group wanted to translate C# code into something else. In the past, each team would write its own parser, of varying quality. With Roslyn, they can simply access the compiler's own syntax parser via an API and get back a syntax tree that's exactly the same as what the compiler would use. (Roslyn even exposes a syntax-coloring API.)

The syntax and binding data exposed by the Roslyn APIs also makes code refactoring easier. It even allows developers to write their own code refactoring algorithms in addition to the ones that ship with Visual Studio.
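
As a hedged sketch of what a home-grown refactoring might look like on top of those APIs (the rewriter class and its renaming rule are invented for illustration, and the namespaces again follow the shipped Microsoft.CodeAnalysis packages), a syntax rewriter visits the immutable tree and hands back a modified copy:

    using System;
    using Microsoft.CodeAnalysis;
    using Microsoft.CodeAnalysis.CSharp;
    using Microsoft.CodeAnalysis.CSharp.Syntax;

    // A toy refactoring: prefix every method name with "Renamed_".
    class MethodRenamer : CSharpSyntaxRewriter
    {
        public override SyntaxNode VisitMethodDeclaration(MethodDeclarationSyntax node)
        {
            var newName = SyntaxFactory.Identifier("Renamed_" + node.Identifier.Text);
            return node.WithIdentifier(newName);
        }
    }

    class RefactoringDemo
    {
        static void Main()
        {
            var tree = CSharpSyntaxTree.ParseText(
                "class C { void Run() { } int Count() { return 0; } }");

            // Visiting produces a new, rewritten tree; the original is
            // immutable and stays untouched.
            var newRoot = new MethodRenamer().Visit(tree.GetRoot());

            Console.WriteLine(newRoot.ToFullString());
        }
    }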

Hejlsberg's most remarkable demo, however, showed how Roslyn's syntax tree APIs make it surprisingly easy to translate source code from one CLR language to another. To illustrate, Hejlsberg copied some Visual Basic source code to the clipboard, opened a new file, and chose Paste as C#. The result was the same algorithm, only now written in C#. Translations back and forth don't yield identical code -- for loops might translate into, say, while loops -- but in every case the code was perfectly valid, ready to compile, execute, or refactor.

Can I have it now, please?
The catch: Hejlsberg wouldn't commit to a ship date for the Roslyn technologies, or even promise that they'd make it into a shipping Visual Studio release. For that matter, he wouldn't comment on any future Visual Studio releases or whether there would be another version at all. Even the Roslyn CTP release is running a little late. At the Build conference, which ran Sept. 13 to 16, Hejlsberg said it would arrive "in four weeks." It arrived yesterday -- a week late -- instead.

Don't think Roslyn is too far-fetched to happen, though. It's actually very similar to the Mono project's Mono.CSharp library, which exposes the Mono C# compiler as a service and enables a REPL console much like the one Hejlsberg demoed at Build. Mono.CSharp has been shipping with Mono since version 2.2.

The main drawback of Roslyn is that it's a complete retooling of the .Net compilers, rather than of the platform itself. That means it's limited to C# and Visual Basic, at least for its initial release. If developers using other .Net languages want to take advantage of Roslyn-like capabilities, those languages' compilers will need to be completely rewritten.

But maybe they should be. If Microsoft succeeds with everything it has planned, Roslyn represents not merely a new iteration of the Visual Studio toolchain but a whole new way for developers to interact with their tools. It breaks down the barriers between compiled and dynamic languages and enables powerful new interactive capabilities in the coding process itself. It truly is one of the most ambitious and exciting innovations in compiler technology in a long time.

2 million iOS 5 users choose Hotmail

Number is growing by some 100,000 every day.

MCTS Certification, MCITP Certification

Microsoft MCTS Certification, MCITP Certification and over 2000+
Exams with Life Time Access Membership at http://www.actualkey.com


According to Microsoft, some 2 million iOS 5 devices are connected to Hotmail, and that number is growing by some 100,000 every day.

With the release of iOS 5 it became easier for users to configure their iDevices to send and receive email via Hotmail, because Apple added it as a default option in the Add Account … screen. And it seems that users have embraced this option enthusiastically.


40% of those connecting to Hotmail use an iPhone 4, and a remarkable 24% are already using iPhone 4S handsets, which is particularly impressive considering that the 4S has only been available for three weeks.

Thursday, November 3, 2011

Parallels update offers new ways to install Lion and Windows

If you plan on running multiple operating systems on your Mac, one route besides a direct installation (such as Windows in Boot Camp) is a virtual machine, which installs the guest OS within OS X so that it and its applications run alongside your OS X applications.



Best comptia A+ Training, Comptia A+ Certification at Certkingdom.com



There are several virtualization options for OS X, including VMware Fusion and Parallels Desktop, both of which offer robust tools for running multiple operating systems and integrating the guest operating system well with the Mac OS. Recently, Parallels released an update to the latest version of Parallels Desktop that, in addition to a round of bug fixes, includes new options for installing and managing operating systems.

In Parallels Desktop 7, the new Wizard interface for setting up virtual machines has a Convenience Store feature for purchasing copies of Windows, in addition to direct links for downloading and installing other popular operating systems such as Ubuntu, Chrome OS, and Fedora, and even for installing OS X Lion using its Recovery HD partition.

With the latest update, the Parallels Wizard now includes a quick way to access and install the latest Windows 8 developer preview in a virtual machine so you can test out Microsoft's latest OS. In addition, the update also provides a way to install OS X Lion directly from the Lion installation application that you download from the Mac App Store. While you could previously install Lion from the Mac App Store download, you first needed to open the installer package and access the InstallESD image file directly. Now you just need to select the installer application to install Lion.

While it may seem a bit odd to install Lion within Lion, it can be useful in some instances, for example if you wish to test a software package before installing it in your main OS. Sequestering the software in a virtual machine lets you see how it installs and how it runs, and if a problem occurs you can easily remove the virtual machine and set it up again.

Tuesday, November 1, 2011

Study: User tools to limit ad tracking are clunky

People who want to limit the behavioral advertising and tracking they are subjected to on the Web aren't well served by some popular privacy tools, according to a Carnegie Mellon University study.

MCTS Certification, MCITP Certification
Cisco CCNA Training, Cisco CCNA Certification 2000+ Exams at Examkingdom.com



Researchers concluded that the tools evaluated in the study, which included IE and Firefox components, were generally too complicated and confusing, leading people to misuse them.

"We found serious usability flaws in all nine tools we examined," reads the 38-page report, released on Monday.

The nine tools fall into three main categories: tools that block access to advertising websites; tools that create cookies that indicate users want to opt out of behavioral advertising; and privacy tools built into web browsers.

The researchers enlisted 45 people to try out the tools. The participants weren't technical experts, nor were they knowledgeable about privacy tools, but they did have an interest in such tools.

Each tool was tried out by five participants. Researchers observed how participants installed and configured the tools, and recorded the users' perceptions and opinions.

"None of the nine tools we tested empowered study participants to effectively control tracking and behavioral advertising according to their personal preferences," the researchers wrote.

Tools for creating opt-out cookies give users a laundry list of ad networks with little or no additional information for users to decide which ones to block.

As a result, users generally opted to block all ad network trackers instead of making informed decisions on a per-company basis, the researchers found.

Another problem: the default settings for most tools were "inappropriate" because they come out-of-the-box with most protections turned off, putting the onus on users to activate and configure them.

A related issue is that the tools do a poor job of explaining to users how they work and how they should be configured, presenting information in terms that were either too simplistic or too technical.

And once configured, the tools didn't clearly communicate to users what they were doing, particularly when they blocked specific content and functionality in websites users were visiting.

The design of the user interfaces also contributed to the users' confusion and inability to properly use the tools, according to the study.

"Our results suggest that the current approach for advertising industry self-regulation through opt-out mechanisms is fundamentally flawed," the researchers wrote.

The tools evaluated by the study are DAA Consumer Choice from the Digital Advertising Alliance; Global Opt-Out and Ghostery 2.5.3, both from Evidon; Privacy Choice's PrivacyMark; TACO 4.0 from Abine; Adblock Plus 1.3.9; Mozilla Firefox 5's privacy panel; and Microsoft IE9's privacy controls and Tracking Protection mechanism.

Rob Shavell, co-founder of Abine, agrees with the researchers' general conclusion.

"People need easier-to-use tools," said Shavell, adding that Abine is working hard to make TACO (Targeted Advertising Cookie Opt-out) simpler.

"We'll get there. We'll create an awesome product," he said in a phone interview. "We're making it easier every day."

There is an ongoing discussion among privacy software vendors, online advertising providers and government regulators about simplifying tools and processes for consumers so that they can more easily control online behavioral ad tracking, such as through the proposed Do Not Track standard, Shavell said.

Complicating matters is that interest among consumers in tools that offer control over online tracking is fairly new, so users aren't generally familiar with this type of software, he said.

A spokeswoman for Microsoft said the Tracking Protection feature in IE9 lets users add "an industry curated" tracking protection list with one click.

"Tracking Protection Lists offers consumers an opt-in mechanism to identify and block many forms of undesired tracking," she said via e-mail.

These lists, compiled by third-party organizations with expertise in this field, let users control which third-party site content can track them when they're online, and they can be designed to either "block" or "allow" certain third party content, she said.

"IE9 also includes the broadly discussed Do Not Track User Preference -- via both a DOM property and an HTTP header, as described in the W3C submission -- as a secondary method," she added.

Scott Meyer, founder and CEO of Evidon, said his company's products aim to create "transparency," not just opt-out mechanisms.

"Transparency enables consumers to make more informed decisions, and the fact is that the vast majority choose not to opt-out when presented with more information," he said in an e-mailed statement. "Our research ... shows that 67 percent of consumers feel more positive about brands which give them this level of transparency and control."

Jim Brock, Privacy Choice's CEO and founder, said independent tests such as the one from Carnegie Mellon are very valuable, even if they are subjective, and Privacy Choice will take the findings into account.

However, Privacy Choice's principal blocking service for consumers is TrackerBlock, which the Carnegie Mellon researchers didn't include in the study, and not PrivacyMark, he said via e-mail.

In addition, internal surveys at Privacy Choice reveal that more than 75 percent of PrivacyMark users understand what the service does and would recommend it to a friend, he said.