
Tuesday, January 29, 2013

Startup NetCitadel aims to orchestrate security management controls in virtualized nets


Startup NetCitadel today launched with a product called OneControl intended to automate what might otherwise be manual research and changes related to configuring firewalls, switches or other gear when virtual-machine (VM) workloads are spun up or down in enterprise data centers or cloud environments.
"We're helping enterprises go from manual processing that's time-consuming to show automated responses to network events," says Mike Horn, co-founder and CEO of NetCitadel, about the purpose of the OneControl virtual appliance. Used in data centers, it can automate determinations about firewall, router and switch settings based on the preferred corporate security policy relative to VM-based workloads, eliminating the need for an administrator to manually research it.
Horn says OneControl can be installed to work with the major VM platforms, including VMware, Xen and Hyper-V. In a VMware-based environment, it works with VMware's vCloud Director and vCloud APIs "to map the intelligence of the virtual device," says Horn, noting that OneControl keeps track of the VM resource pool and related information, such as IP addresses, to determine what changes might need to be made to network firewalls, switches or routers to conform to security policy.
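To make the workflow concrete, here is a minimal sketch of the kind of mapping such an appliance automates -- watching VM inventory events and translating them into candidate firewall rules. NetCitadel hasn't published OneControl's internals, so the event fields, policy table and rule syntax below are hypothetical illustrations, not the product's API.

```python
from dataclasses import dataclass

@dataclass
class VmEvent:
    vm_name: str
    ip_address: str
    action: str          # "spun_up", "moved", or "spun_down"
    security_group: str  # the policy tier the workload belongs to

# Corporate policy: which inbound TCP ports each security group may expose.
POLICY = {
    "web": [80, 443],
    "database": [5432],
}

def firewall_changes(event: VmEvent) -> list[str]:
    """Translate one VM inventory event into candidate firewall rule changes."""
    if event.action == "spun_down":
        return [f"remove all rules referencing {event.ip_address}"]
    ports = POLICY.get(event.security_group, [])
    return [f"permit tcp any host {event.ip_address} eq {p}" for p in ports]

print(firewall_changes(VmEvent("web-07", "10.0.4.21", "spun_up", "web")))
# ['permit tcp any host 10.0.4.21 eq 80', 'permit tcp any host 10.0.4.21 eq 443']
```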
Available for about $25,000, the product competes against similar security-policy management and orchestration offerings from Cisco and Juniper. The idea is when VM workloads are moved around, OneControl can immediately advise on changes that need to be made to gear that today includes certain Cisco and Juniper routers, switches, firewalls and security gateways. A typical question it's designed to answer is, "If vMotion happens inside a network, how does that impact firewall devices?" says Horn. In the future, NetCitadel plans to bring intelligence about other gear, such as load balancers, into the equation as well.
OneControl can be deployed in either enterprise networks or cloud services, though testing so far has focused mainly on supporting the Amazon AWS cloud, says Horn.
OneControl has been in early adoption for about five months at Kenettek, the Broken Arrow, Okla.-based managed services and data center provider which serves the oil and gas industry, among others. Almost the entire Kenettek data center is virtualized, says Ken Dobbins, service manager there, noting that OneControl is saving a huge amount of time in configuring services in routers and firewalls when new VM server clusters are spun up or otherwise changed for customers.
OneControl immediately provides security-policy directions to the Kenettek help desk staff rather than requiring them to research how a VM-based change will affect security policy-based configurations on firewalls and routers. This not only saves a huge amount of time, but it's also turning out to save on VMware licensing charges, which are now based on "committed RAM per hour," says Dobbins. In the energy sector, where data related to SCADA controls is collected during certain peak hours, that makes a difference, he points out.
NetCitadel, based in Mountain View, was founded in 2010 by Horn with Theron Tock, CTO, and Vadim Kurland. The company has received an undisclosed amount of funding from New Enterprise Associates.
Ellen Messmer is senior editor at Network World, an IDG publication and website, where she covers news and technology trends related to information security. Twitter: @MessmerE. Email: emessmer@nww.com.
Read more about wide area network in Network World's Wide Area Network section.


Friday, January 18, 2013

Tech hotshots: The rise of the UX expert


Roberto Masiero vividly remembers the moment in 2011 when it became clear to him that designing a mobile application was a considerably different effort than designing a desktop application.
As head of the innovation labs for ADP, the $10 billion payroll services firm, he managed the engineering team tasked with creating ADP Mobile, the company's version of its human capital management application for mobile devices.
"We started out with a list of 100 features that we thought were awesome," Masiero remembers, but his team's enthusiasm ran smack into the collective disdain of the user experience designers they'd brought in from an outside agency, who deemed feature after feature irrelevant for mobile users, arguing that so many options would just confuse them.
By the time the designers were done, they had whittled the list of features down by 80%. "Their message was simple," says Masiero. "Less is more." In a mobile application, it is better to cleanly provide the 20 most important pieces of information than force people to navigate through 100 that they might never use. "We learned that you have to drop completeness in the name of usefulness."
What's more, Masiero, like a lot of other tech leaders, realized that in this age of mobility and user-driven technology, IT shops that don't have a user experience expert onboard need to get serious about begging, borrowing or stealing one -- an increasingly difficult proposition.
Developers with user interface (UI) and user experience (UX) expertise are hot these days, according to Shane Bernstein, managing director of QConnects, a Culver City, Calif.-based digital recruitment firm. And it's a fairly recent phenomenon, he says. Between 2010 and 2011, QConnects saw a 25% increase in the number of requests for UX designers; between 2011 and 2012, the increase was 70%.
Salaries are going up as well. Recruiters cite starting salaries ranging from $70,000 to $110,000, with the upper end hitting $150,000 and up. The Creative Group, a division of Robert Half Technology that specializes in design, marketing and interactive talent, began tracking UX designers separately in its annual salary survey in 2011. Salaries went up 6.2% in 2012 and it expects another 4.8% increase in 2013.
"And be prepared for a local variance factor," says Donna Farrugia, executive director of The Creative Group. "If you live between San Francisco and San Jose, add 30%."
Thanks to Apple, users expect perfection
In design parlance, the user interface (UI) is what the user sees; the user experience (UX) is how the application behaves. Both recruiters and practitioners stress that designers need to know the latter as much as the former. That is, they need to concentrate not only on how a design looks, but on the whole "wireframe" of the application and how its requests flow into the back end of the system.
What's driving the demand for such skills? Many people in the industry lay the credit -- or perhaps blame -- on Apple, with its near-fetishistic attention to how design, hardware and interface intersect. "Now people expect everything they interface with to have the ease of use of the iPhone," says Matt Miller, CTO of Irvine, Calif.-based technical recruiting firm CyberCoders.
"Apple forces everybody to match their aesthetic," agrees Masiero. "The image of your brand is at stake in your mobile application now. Companies that have great design, whether they're a restaurant chain or a car manufacturer, have a more valuable brand," and the same standards apply internally, he says.
Moreover, as mobile computing explodes, a company's client base becomes both broader and more demanding of a consumer-like product experience. As Masiero notes, 10 years ago his company's sole target audience was the human resources department. That's no longer true.
"With mobile devices becoming ubiquitous, we have to serve 30 million users, from somebody on a construction site to an airline pilot to a hotel manager. And you have to create a design so that the experience is accessible to everyone, while still providing them with a sense of uniqueness," he says.
High tech, high touch
With design at the forefront of everyone's mind, UX experts are suddenly in high demand and short supply. One reason they're hard to find is that the position spans multiple disciplines: design, programming and human behavior. "When you find that person, let me know," jokes Masiero.
"We do a little bit of market research, a little bit of psychology. We're synthesizers, pulling bits and pieces of different methodologies together," says UX designer Whitney Quesenberry, who runs her own agency in High Bridge, N.J. and has done work for Novartis, Siemens, Dow Jones and Eli Lilly among other companies. "UX is like programming -- there's not just one job involved."
Why UX designers love their jobs
The job description is amorphous and challenging -- to understand a given app's interface requirements, user experience context and back-end machinations. But the pay is mighty attractive -- between $70,000 and $110,000 to start, recruiters say -- and the perks associated with a UX position sound like the halcyon days of the Internet boom: stock options, signing bonuses, flexible work hours.
One recruiter reported seeing a company offering liquor in its vending machines, and another employer offered designers unlimited time off (in return for results, of course).
And UX designers themselves say there are other, intangible benefits to the position. "Money only takes you so far," says Michael Beasley, a designer for Internet marketing agency Pure Visibility in Ann Arbor, Mich. "The work has to be interesting, not the same things over and over again. I like having fresh problems to tackle and the feeling that I'm making a difference for our clients."
"The real perk is meaningful work," Quesenberry says. "Why would anybody want to work on something where you spend the first six months writing about requirements and the next six arguing about them?"
Quesenberry's advice for becoming a highly prized designer with both technical depth and design breadth? Check out one of the multiple master's programs, such as the one at the University of Michigan, aimed at people already in the workforce, or talk your way onto one of the hybrid design teams that are becoming more prevalent within IT departments and learn all you can.
The Creative Group's Farrugia insists that the more cross-disciplined a designer is, the better, with the ability to combine good design and layout background with technology skills encompassing HTML coding and JavaScript. "The ideal is this hybrid person who's both right-brained and left-brained, high tech and high touch."
That pretty closely describes Beasley, who got a BA in both English and music from the University of Michigan and then stayed to earn a master's degree in Human-Computer Interaction from its School of Information in 2005.
"That's where I got my approach to interface design," Beasley says. "The multidisciplinary approach taught me design, human cognition and usability principles and methods. I also got a good understanding of how organizations work and information flows. That made me a pretty well-rounded person."
That kind of background sits well with IT managers like Masiero, for whom good design goes deeper than rounded corners on icons. "I want you to be a wizard of understanding the mental model of the user and translating that into the behavior of the application. You have to always think about making the user comfortable, about not creating any friction between what the user expects to happen and what the application expects from the user."
"Designers who understand human interaction are one step ahead of everyone else," agrees Farrugia. "They are rare and precious commodities."
Grow your own UX team?
With so much in the business world dependent on the success of mobile applications these days, most companies feel they can't forgo development until colleges or vocational schools churn out more graduates with the ideal mix of design and coding sensibilities.
In the meantime, they cope by forming multidisciplinary teams to stand in for one perfect UX expert. "A designer might not be able to program, but they should be able to have a reasonable conversation with a programmer so they understand the impact of a design decision," says Quesenberry.
Farrugia has seen these hybrid design teams form more frequently over the past few years. "We've been coaching people in the design world to learn interaction and Web and digital skills, so they've been adding to their portfolio. Vice versa, people on the technical side are interacting more frequently with the front-end team to understand usability, personas and usage scenarios."



EU Commissioner Kroes won't be bullied on net neutrality, says spokesman


Europe's digital agenda commissioner will not succumb to pressure on the issue of net neutrality, her spokesman said via Twitter on Friday.
In response to allegations by digital rights organization La Quadrature du Net that Commissioner Neelie Kroes had caved in to telecoms operators and was giving up on net neutrality, her spokesman Ryan Heath tweeted, "anyone thinking @NeelieKroesEU would let herself be bullied into diff opinions by any company/NGO, well, that just isn't her #NetNeutrality".
In her blog on Thursday, Kroes said consumers should be free to make their own choices about their Internet subscriptions, but that this "does not preclude consumers from subscribing to more differentiated, limited Internet offers, possibly for a lower price."
La Quadrature du Net interpreted this to mean that "Kroes supports the creation of a fragmented Internet, banning innovation and opening the door to unacceptable censorship."
"By deliberately ignoring that such offers would change almost nothing for operators in terms of cost, but would allow them to avoid investing in the development of network capacity while restraining possibilities for citizen participation, Neelie Kroes takes into account only short-term private interests that run contrary to public interest," the organization said in a press statement.
Kroes responded in an emailed statement, saying: "Make no mistake: I am in favor of an open Internet and maximum choice. That must be protected. But you don't need me or the E.U. telling you what sort of Internet services you must pay for."
The Commissioner had pointed out that opting for blocking ads or requesting privacy via do-not-track mechanisms "may mean you don't get access to content for free."
"The Internet does not run on its own. The network, content and Internet access all have to be paid for by someone. Many smaller web operators exist on the basis of innovative advertising models. There are various ways consumers pay for content, including by viewing advertisements before or during their access to content. Businesses should accept that different consumers will have different preferences, and design services accordingly," Kroes said.
Follow Jennifer on Twitter at @BrusselsGeek or email tips and comments to jennifer_baker@idg.com.


Lands' End, software vendor at contract impasse after 20-year relationship


Lands' End is at legal loggerheads with its longtime payroll software vendor over how much longer the clothier can lawfully use the application, with US$1 million in potential fees hanging in the balance.
Lands' End signed a 20-year contract with Genesys Software Systems in January 1993, but the software didn't go live until Oct. 28 of that year, according to its complaint filed this week in U.S. District Court for the Western District of Wisconsin. Genesys was acquired by PeopleStrategy in 2010.
Now Lands' End is moving to another software vendor, and in August "reached out to Genesys to attempt to clarify and, if necessary, extend the license to cover the expected transition period."
Early discussions were "productive" but Genesys later broke off talks, according to the complaint. On Jan. 9, the two companies held a conference call during which Genesys officials said that Lands' End's license would be terminated on Jan. 19.
In addition, "reneging on previous offers, informed Lands' End that it only would extend the license if Lands' End paid Genesys approximately $1 million," according to the complaint.
That was apparently because Genesys' licensing policy had changed sometime during the companies' 20-year relationship.
"We no longer offer term licenses, but our perpetual license is currently priced at $999,950," wrote Colin Macdonald, director of finance at PeopleStrategy, in an email to Lands' End dated Jan. 9. Lands' End included the email in its court filing.
Genesys attorneys sent Lands' End a letter on Jan. 14, demanding that it uninstall the software and "send written certification of these activities" by the end of Jan. 19, the complaint adds. Genesys also "threatened to seek a temporary restraining order and all fees in connection with such action if Lands' End does not comply with these demands."
Lands' End, however, is maintaining that it has the right to keep using Genesys' software until Oct. 28 of this year, which would be 20 years after the go-live date, not the effective date of the agreement. "The License Agreement provides for twenty years of 'use' of the Genesys Software," the complaint states.
"For Lands' End to prematurely cease using the Genesys software and to expedite its transition to a new software vendor would cause Lands' End considerable and unnecessary damage and expense," it adds. Lands' End is asking the court for a declaratory judgment stating that it has the right to continue using the software and that the license term doesn't expire until Oct. 28.
PeopleStrategy didn't immediately respond to a request for comment Friday. As of Friday, it had not filed a response to Lands' End's complaint.
"I would love to hear the story from the Genesys side," said analyst Frank Scavo, president of the consulting firm Strativa. "Unless there are mitigating factors not mentioned in this lawsuit, it would appear that Genesys has a gun pointed to the head of Lands' End. If this was standard language in Genesys' contract 20 years ago, I have to wonder if other customers have run up against this problem and how they have resolved it."
Whatever the outcome of the case, there's a lesson to be learned for all software customers, according to Scavo.
"Twenty years may seem like a long time when you are signing a software agreement, but when you are signing a software agreement, you need to assume you will reach the end of any license period," he said.
"I would advise buyers to negotiate a perpetual license agreement whenever possible," Scavo added. "I would advise buyers to avoid limited term licenses to avoid situations like the one that Lands' End now appears to be in. If the vendor insists on a limited license period, buyers should at least negotiate the terms of extending the license agreement beyond the termination period."
Chris Kanaracus covers enterprise software and general technology breaking news for The IDG News Service. Chris' email address is Chris_Kanaracus@idg.com


CA upgrades workload automation software


CA Technologies has released the latest update to its Workload Automation software (WLA) featuring more powerful analytics tools, a streamlined user interface and expanded functionality for managing business processes throughout the enterprise.
CA competes with IBM, BMC and others on workload automation. The newest release expands support for SQL Server and broadens the variety of job types that can be managed with the system. New reporting capabilities also allow IT managers to centrally track business processes.
"WLA is one of our largest product lines because of its ability to orchestrate business services in production environments to deliver value across the spectrum of IT services," says Mark Combs, distinguished senior vice president for CA's mainframe business.
Torsten Volk, a senior analyst at Enterprise Management Associates, says CA and other competitors are increasingly attempting to expand workload automation so that it is more tightly integrated with business requirements. The hope is to create systems management solutions that can automate and orchestrate entire business processes. Workload automation tools integrate with other enterprise software, such as enterprise resource planning (ERP), customer relationship management (CRM) and content management systems (CMS). "Workload automation ties all of those things together to ensure they all deliver the right information to the right place at the right time," Volk explains.
Consider a supermarket chain: a cash register system may be connected to an inventory management system, which in turn is connected to a product ordering system to ensure the shelves stay appropriately stocked.
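In the spirit of that example, here's a toy sketch of the job-dependency chaining such tools manage. The job names and bare-bones topological scheduler are invented for illustration; a real product like CA's WLA adds calendars, retries, SLAs and cross-system agents.

```python
from graphlib import TopologicalSorter

# job -> set of jobs that must finish first
JOBS = {
    "close_registers": set(),
    "update_inventory": {"close_registers"},
    "generate_orders": {"update_inventory"},
    "notify_suppliers": {"generate_orders"},
}

# static_order() yields each job only after all of its prerequisites
for job in TopologicalSorter(JOBS).static_order():
    print("running", job)
# close_registers, update_inventory, generate_orders, notify_suppliers
```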
As organizations continue to look to cloud computing models, this process becomes more complex, and the business process awareness of tools such as CA's WLA becomes essential. "Automation and orchestration is the backbone of cloud," Volk says.
CA has competition in the WLA market, though. Volk says IT shops that have existing relationships with CA are most likely to stick with the company, which has made that easy to do: graceful upgrades let IT managers pick and choose which portions of the WLA package are upgraded at any given time, so previous versions of WLA will work with the new one.
In addition to the SQL Server compatibility, the newest release also includes other new features such as a remote execution agent, which allows up to six remote systems to be handled under a single license. Improvements were also made to further support Oracle and PeopleSoft.
Network World staff writer Brandon Butler covers cloud computing and social collaboration. He can be reached at BButler@nww.com and found on Twitter at @BButlerNWW.
Read more about software in Network World's Software section.


Jobs' house burglar gets seven-year sentence


The man who broke into the Palo Alto, California, home of late Apple CEO Steve Jobs and stole laptops, iPads and other possessions has been sentenced to seven years in a California state prison.
Kariem McFarlin, 35, was arrested in August last year by officers from the Rapid Enforcement Allied Computer Team, a Silicon Valley-based high-tech crime unit formed by local, state and federal law enforcement agencies.
REACT officers found McFarlin with help from Apple security, which tracked where the stolen devices were being used by matching their serial numbers with connections to Apple iTunes servers. The IP address in use matched a line in McFarlin's apartment in nearby Alameda that was also being used by an Apple device registered to a member of his family, according to a police report.
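The technique described amounts to a join between a list of stolen serial numbers and server connection logs. The sketch below is illustrative only -- the log format, field names and values are invented, not a depiction of Apple's systems:

```python
stolen_serials = {"C8PJ4XYZABCD", "DMPK2XYZEFGH"}  # invented serial numbers

# (serial, source_ip) pairs standing in for device check-in logs
connection_log = [
    ("C8PJ4XYZABCD", "76.103.0.1"),
    ("AAAA1111BBBB", "12.34.56.78"),
]

# Any stolen device that phones home reveals the IP address it used.
hits = [(serial, ip) for serial, ip in connection_log if serial in stolen_serials]
print(hits)  # [('C8PJ4XYZABCD', '76.103.0.1')]
```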
The burglary happened between the evening of July 16 and morning of July 17 last year while renovation work was being carried out on the Jobs house, which is now occupied by Jobs' wife.
McFarlin jumped over a construction fence and entered the house through its garage. Once inside, he stole two iMacs, three iPads, three iPods, one Apple TV box, a diamond necklace and earrings, and several other items.
McFarlin admitted to the burglary under questioning by Palo Alto police and said he had stolen from other homes in the San Francisco Bay Area, including two homes in Marin County, four homes in San Francisco County and one home in Alameda County.
He admitted keeping hundreds of thousands of dollars' worth of property from those burglaries at his home and at a storage locker. The property included computers, jewelry, furniture and a solid silver bar, according to the Santa Clara District Attorney's office.
At the time of his arrest, he apologized for his crimes and said he had taken to crime because he had money problems and was desperate.
He didn't dispute the charges in court. He was also ordered to pay restitution to the victims of his crimes.
Martyn Williams covers mobile telecoms, Silicon Valley and general technology breaking news for The IDG News Service. Follow Martyn on Twitter at @martyn_williams. Martyn's e-mail address is martyn_williams@idg.com


5 PC industry omens hidden in Intel's financial statements


Multiply the joy of watching paint dry by the sheer pleasure of watching grass grow, and you'll get a decent idea of how exciting it is to parse the average corporate earnings report.
But everything changes when those numbers come from Intel. Don't get me wrong: Intel's Thursday afternoon earnings call was still soul-suckingly boring. But as one of the cornerstones of the old Wintel hegemony, Intel's yearly results and estimates serve as an unofficial barometer for the PC industry as a whole. As Intel goes, so goes the entire desktop ecosystem, and hidden deep in the company's newly released financial statements are five portents for the PC industry of 2013--and beyond.
1. The PC is not dead
This one's easy: Next time a pundit tells you the PC is going the way of the dodo, tell him to stuff it. Sure, general PC sales were down slightly in 2012--3 percent in the case of Intel's PC Clients Group, and an estimated 3 to 5 percent for the industry overall--but desktops and laptops still do tremendous business.
"If you're looking at 350 million units (shipped in 2012), that is not a dead market," says Patrick Moorhead, founder and principal analyst at Moor Insights and Strategy. "The PC industry may be slowing, but it's certainly not dead."
As a whole, Intel managed to snag $53.3 billion--yes, billion with a "B"--in revenue in 2012. That's more than $1 billion week in and week out. Oh, and while Intel's PC revenues were down 3 percent in 2012, actual unit volume was only down 1 percent. Dead? Hah.
2. ...but the focus is changing
No, the consumer PC hasn't given up the ghost, but its days of epic growth have definitely stalled. Intel expects sales revenues to increase only in the single digits in 2013 after the down year of 2012.
Intel sees the writing on the wall and is working hard to diversify its lineup to match industry trends--starting, of course, with those pesky tablets. During his intro to Intel's earnings conference, CFO Stacy Smith spent as much time waxing poetic about the company's 2013 mobile initiatives as he did talking up Ultrabooks and desktop processors. That itself follows a new trend for the company: At CES, Intel's Bay Trail tablet processors and Lexington smartphone processors enjoyed just as much limelight as the upcoming Haswell CPUs.
The company is also placing a bigger focus on business customers and, yes, the now-ubiquitous cloud. Intel's server-focused Data Center Group was the only division that saw revenues increase in 2012, and Intel expects DCG's revenues to grow in the double digits this year. The new focus on servers and mobile technology gives Intel a unique chance to double-dip the market.
"Data center and cloud are Intel," Moorhead says. "Their big boosts in sales were driven by that. Every handset and tablet that gets sold connects to the cloud, and Intel is providing the cloud. People forget that their hardware is driving the cloud right now."
ARM, the 800-pound gorilla in the mobile world, is also turning its attention to the cloud, with 64-bit ARM processors expected to hit server racks in 2014. In fact, the entire PC industry has shifted a lot of focus to the business arena, emphasized by Dell and HP's attempted reinventions as enterprise-focused companies.
3. Blurring lines and blending uses
The future, to hear Intel CEO Paul Otellini tell it to investors, lies in hybrids. (That shouldn't come as a surprise if you've been paying attention thus far.)
The first round of Windows 8 hybrids hasn't exactly taken the world by storm, but Otellini expects mobile technology to split into two distinct camps going forward: tablets and phablets in the 5- to 7-inch range, and larger 10-inch-plus offerings. Otellini expects those larger hybrids to offer PC-like performance in a slim, tablet-like form factor thanks to power improvements found in the Haswell and Broadwell processors slated to launch over the next two years. Patrick Moorhead agrees.
"All points converge on 2014," Moorhead says. "In 2014, you'll be able to have a very high performance, 9mm thin, fanless, low-cost tablet based on Haswell technology. At 10 inches and above, you'll be able to slide it right into a keyboard dock. Why on earth would you buy a separate tablet? Because you're not compromising as a notebook, and you're not compromising as a tablet, there really won't be a market for stand-alone 10-inch tablets." That doesn't sound good for Windows RT--or for notebooks, really.
Intel's already laying the groundwork for the flipping, sliding, and oh-so-versatile future of laptops. At CES, the company announced that laptops powered by Haswell processors will need a touchscreen in order to carry the Ultrabook name.
4. Racing towards the top
One thing about hybrids, though: They're more expensive than their less-flexible counterparts. Despite recent howls for cheap touchscreen notebooks to boost Windows 8 sales, we're more likely to see manufacturers futzing around in the high end rather than duking it out for low-cost supremacy.
Sales of touch-enabled Windows 8 models in the fourth quarter have convinced Otellini that "people are willing to spend a little bit more to get a more capable product. That's certainly been true in the Apple model for many, many years, and I think there is a model of getting paid for innovation." Get ready to whip out your checkbooks, folks.
NPD data from the holiday season found that the average selling price of an Apple laptop was $1,419--exactly $999 more than the $420 selling price of the average Windows notebook. Meanwhile, sales of Windows notebooks under $500 dropped 16 percent, while sales of $500-plus laptops grew by 4 percent. OEMs aren't dumb. They want in on that gravy train, and we've already seen manufacturers like Dell and Acer dump low-end products to focus on Ultrabooks and other products with higher margins, though that didn't exactly pay off in 2012's down economy.
But fear not, budget-minded PC fans: Cheap laptops aren't ready to dodder off into the sunset quite yet. "Intel's not saying they won't participate in the low-end market. In fact, they have parts like Atom and Pentium that prove that they will," Moorhead says. "What they're saying is that they're going to put their focus into new usage models that require higher performance and drive even better experiences."
Those better experiences, Moorhead says, will culminate in Intel's "Perceptual Computing" initiative, which blends computer control with human senses. Speaking of innovation...
5. Racing towards the top, part II: Moore continues laying down the law
Consumer PC sales may be slowing, but Intel's focus on creating smaller, better, more efficient processors hasn't wavered. The company is still building towards a bright PC future, spending a whopping $18.2 billion--again, that's billion with a "B"--on R&D and acquisitions last year. That number's expected to jump to $18.9 billion in 2013.
Intel isn't just funding killer company parties with all that cash, either. The company plans to start production on the 14nm chip-making manufacturing process in 2013. "This puts us significantly ahead of the competition," CFO Smith said during the earnings conference. Intel's current Ivy Bridge chips are built using a 22nm manufacturing process, while AMD's processors have been stuck at 28nm.
Expect 14nm Broadwell chips to start showing up in 2014, but that's not all Intel has up its sleeve. In 2013, the company also plans to start initial work on the 10nm manufacturing process, the 2016 die-shrink "tick" following the new Skylake "tock" architecture planned for 2015.
But while Intel's chips are getting smaller and smaller, the company's working hard to increase the size of the silicon wafers those chips are cut from. Current wafers measure 300mm, and Intel wants to move to 450mm. Bigger wafers mean lower production costs, which might--just maybe--result in lower CPU prices in the future. Although the move to larger wafers isn't expected to really start ramping up until the latter half of the decade, Intel has already begun investing in the transition process. Yep, Chipzilla thinks long-term.
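The economics are easy to sketch: gross die per wafer grows roughly with wafer area, while processing cost per wafer grows more slowly. The estimate below uses the standard gross-die approximation, with the die size an assumed round number rather than an actual Intel figure.

```python
import math

def gross_dies(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Classic gross-die-per-wafer estimate: wafer area over die area,
    minus an edge-loss term for partial dies around the rim."""
    d, s = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / s - math.pi * d / math.sqrt(2 * s))

die_area = 160.0  # mm^2 -- an assumed, plausible desktop CPU die size
for wafer in (300, 450):
    print(f"{wafer}mm wafer: ~{gross_dies(wafer, die_area)} dies")
# 300mm wafer: ~389 dies
# 450mm wafer: ~914 dies -- roughly 2.3x the dies per wafer processed
```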
A key 2012 investment could pay off for both of those initiatives. In July, Intel gave ASML Holdings $3.3 billion to stimulate the development of both 450mm wafers and extreme ultraviolet lithography, a next-gen technology candidate that could help Intel make chips using ever-smaller CMOS manufacturing processes. Intel expects the immersion lithography process used to make current day chips to become ineffective in sub-10nm chips.
Intel has said it doesn't expect the EUVL technology or 450mm wafers to be ready for the 2016 roll-out of 10nm "Skymont" chips, but Otellini declined to provide an update about 10nm chips and the possible use of EUVL when asked about it during the earnings conference.
To infinity, and beyond!
Sure, Intel's operating income may have been down a bit in 2012, but taken overall, the company's earnings point to a vibrant, cutting-edge future for PCs, and hundreds of millions of PCs at that. That future might look different than the present we recognize, with more blurring of the lines and segmented niches, but the PC's outlook has never looked brighter--or more utterly transformative.




Saturday, December 29, 2012

After claiming Verizon attack, hacker and the spoils disappear


Hours after boasting about the theft of 3 million records from Verizon Wireless, the hacker claiming responsibility for the attack and the purloined data posted to Pastebin have disappeared from the Web.
A search for the hacker's Twitter handle, @TibitXimer, produced a "Sorry, that page doesn't exist!" message.
Meanwhile, the data claimed to belong to Verizon Wireless appears to have been removed from Pastebin, a popular site for hackers to post stolen data.
Verizon Wireless is denying that the file that was posted to the Internet contained information from its customers. "We have examined the posted data and we have confirmed that it is not Verizon Wireless customer data," Verizon spokesperson Alberto Canal told ZDNet. "Our systems have not been hacked."
The hacker later revised his story about the origin of the data, telling ZDNet the data was from Verizon FiOS files, not Verizon Wireless.
Old hack?
Security researcher Adam Caudill, who viewed the data before it disappeared from Pastebin, wrote on Twitter that the information was posted months ago to the Internet. "The file that's going around is one of the files that we discussed back in August," he tweeted. "Nothing new."
"It's part of a set of files that was posted in August; I strongly suspect it's a telemarketing file or similar," he added.
ZDNet broke the theft story on Saturday, reporting that a hacker had posted 300,000 database entries belonging to Verizon Wireless.
The hacker told ZDNet that he'd breached the Verizon database on July 12 and downloaded an estimated 3 million records containing names, addresses, mobile serial numbers, the opening date of each account, and account passwords.
The hacker added that he decided to post a portion of the pilfered information to Pastebin because Verizon had not fixed the vulnerability since the hacker had exploited it.
Although sympathetic with the hacktivist collective Anonymous, the Verizon hacker told ZDNet he had no affiliation with that organization.
Verizon spokesman Canal confirmed to ZDNet that a breach had taken place months ago and had been reported to law enforcement authorities.
Many of the details about the incident claimed by the hacker were incorrect or exaggerated, he added. All customers affected by the incident were notified at the time, and safeguards were taken to protect their data and privacy.
Twitter tries to tame boasts
It's believed that Twitter suspended the hacker's account after learning about his claims.
Twitter has been fighting its "dark side" for years with mixed success.
It has also attempted to add more transparency to enforcement actions it takes on members' accounts. For example, Twitter launched a new policy in November calling for takedown messages to be posted to a member's tweet feed when one of their tweets had been removed for an alleged copyright violation.
Before the policy change, such tweets just disappeared from a feed stream without explanation, making it more difficult for whoever posted the tweet to challenge the takedown.


Toshiba to launch 20-megapixel image chip for digital cameras


Toshiba is preparing a 20-megapixel image sensor for digital cameras that it says will be the highest resolution of its kind.
The Tokyo-based firm said the new chips will be able to support capturing 30 frames per second at full resolution. They will also be able to shoot video at 60 frames per second at 1080p, or 100 frames per second at 720p.
Toshiba said it will begin shipping samples of the new CMOS chips from next month, with mass production to begin in August of 300,000 units monthly. Toshiba is best known in components for its NAND flash memory, which it develops with partner SanDisk, but is also a major manufacturer of LSI and other semiconductors.
Digital point-and-shoot cameras are steadily falling in price, squeezed between brutal competition among manufacturers and the increasing threat of smartphones and mobile devices. While the number of pixels a camera can capture is not always a direct measure of the overall quality of its images, it is a key selling point to consumers.
The image resolution of top-end smartphones now often meets or exceeds that of digital cameras. The Nokia 808 PureView, launched earlier this year, has a 41-megapixel image sensor.
The Japanese manufacturer said it has increased the amount of information the pixels in the new chip can store compared to its previous generation of CMOS, producing better overall images. It has also reduced the size of the pixels: the new 20-megapixel version has individual pixels that measure 1.2 micrometers, down from 1.34 micrometers in its 16-megapixel product.
CMOS, or complementary metal-oxide semiconductor, sensors contain rows of electronic pixels that convert light into digital signals, as well as on-chip processing technology that can enhance images or speed transfers.
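For a sense of scale, the article's two figures -- 20 megapixels at a 1.2-micrometer pitch -- pin down the approximate sensor geometry and throughput. The 4:3 aspect ratio below is an assumption, as Toshiba's exact sensor dimensions aren't given.

```python
pixels = 20_000_000   # resolution from the article
pitch_um = 1.2        # pixel pitch from the article

# Assume a 4:3 aspect ratio, typical for compact-camera sensors.
height_px = (pixels * 3 / 4) ** 0.5
width_px = height_px * 4 / 3
print(f"grid: {width_px:.0f} x {height_px:.0f} pixels")
print(f"active area: {width_px * pitch_um / 1000:.1f} x "
      f"{height_px * pitch_um / 1000:.1f} mm")  # ~6.2 x 4.6 mm, 1/2.3-inch class

# Readout at full resolution and 30 frames per second:
print(f"{pixels * 30 / 1e6:.0f} Mpixels/s")     # 600 Mpixels/s
```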
Toshiba says its goal is to achieve a 30 percent market share in CMOS sensors for digital cameras by the fiscal year that ends in March 2016.


Saturday, December 1, 2012

OLPC cancels XO-3 tablet, downplays need for new hardware


One Laptop Per Child has cancelled plans to release its XO-3 tablet, although technology from that project could still be used in other products, OLPC Chairman Nicholas Negroponte said.
"The XO-3 is by no means gone. It may emerge in its constituent parts rather than as a whole," Negroponte said via email.
OLPC started off in 2005 as a laptop project and is well known as a hardware innovator, with its first XO-1 laptop being praised for its unique and environmentally friendly design. The XO-3 cancellation comes as OLPC officials say the organization could de-emphasize the focus on hardware design in the long run in favor of education projects.
The nonprofit group announced plans for the XO-3 tablet in 2009 and showed early samples at CES earlier this year. The tablet was supposed to ship earlier this year for US$100, but it was delayed while OLPC finalized the design and sought partners to manufacture the XO-3. The tablet was meant to be a low-cost computing tool for students in developing countries.
The XO-3 was originally priced at $75, and that triggered a backlash, in part because critics said the price was unrealistic. OLPC didn't plan to have the product manufactured itself, as it did with the XO-1 laptop, which was also delayed and eventually shipped at double its promised $100 price tag.
The XO-3 design is still available, and it is more likely that companies will use some of the tablet's key technologies, such as flexible power input and charging efficiency, said Ed McNierney, the chief technology officer at OLPC.
"There's a lot of decent tablet technology out there -- it's really a question of putting things together in the right package for the children we're trying to serve," McNierney said in an email. "The Nexus 7 is nice, too, and a more kid-friendly size, and there are other good examples."
The tablet shown at CES had a rugged body, an 8-inch screen and included optional technologies such as a solar charger and support for satellite Internet. It used a display from Pixel Qi that conserves battery life by using ambient light to brighten the screen.
OLPC's priority has always been education and the need to design its own complete hardware systems "may go away," Negroponte said. Tablets are an important learning tool for children, but companies may be able to ruggedize existing low-cost products for use in schools, he said.
"We had to build the [XO-1] laptop, but we do not have to build the tablet," Negroponte said, adding that, "the need for OLPC may morph into something else."
OLPC also designed a hybrid laptop-tablet called the XO-4 Touch, which includes some of the XO-3's features. That product is still scheduled to ship early next year. The XO-4 resembles the original XO-1 laptop but has a touchscreen that can swivel around and fold over the keyboard to make an e-reader.
As an alternative to the XO-3, Negroponte is not opposed to buying low-cost tablets and distributing them to schools. Tablets from companies such as Motorola, which have been deployed as an educational tool in developing countries, have shown good power management and no breakage in rugged environments.
"I am surprised how good they are, as they were not designed for [the] environment," Negroponte said.
Experiments have shown that tablets have made basic learning and computing easier, he said.
"The amazing result is that the kids are showing all the precursors of reading," Negroponte said.
OLPC will continue with hardware design on the XO-4 and beyond for the simple reason that there are now nearly 3 million XO devices around the world, McNierney said.
"That means two things: ongoing support for the existing customers, and ongoing engineering to keep the design current. Existing customers need additional units, spare parts, etc. and that need won't go away," McNierney said.
Components also must be refreshed every 18 to 24 months to keep using readily available parts and to keep the price down.
"That doesn't mean, of course, that OLPC needs to be the organization to do those things in the long run. That's the nice part of being a nonprofit; we do things -- like design hardware -- when no one else is stepping up to do them. If someone else can do them, we can stop," McNierney said.
Agam Shah covers PCs, tablets, servers, chips and semiconductors for IDG News Service. Follow Agam on Twitter at @agamsh. Agam's e-mail address is agam_shah@idg.com


DOE wants 5X battery boost in 5 years


The U.S. Dept. of Energy has set a goal to develop battery and energy storage technologies that are five times more powerful and five times cheaper than today's within five years.
To accomplish this, U.S. Energy Secretary Stephen Chu is taking some lessons from U.S. history.
The DOE is creating a new Joint Center for Energy Storage Research, at a cost of $120 million over five years, intended to reproduce the focused development environments used successfully at Bell Laboratories and in the World War II Manhattan Project, which produced the atomic bomb.
"When you had to deliver the goods very, very quickly, you needed to put the best scientists next to the best engineers across disciplines to get very focused," said Chu at a press conference Friday that was streamed live from Argonne National Laboratory in Illinois. The center will be located there.
The Battery and Energy Storage Hub project will involve six national labs, five universities -- Northwestern University, University of Chicago, University of Illinois-Chicago, University of Illinois-Urbana Champaign, and University of Michigan -- and four private firms, Dow Chemical, Applied Materials, Johnson Controls, and Clean Energy Trust.
While physical proximity will have a role in the research, Chu said electronic communications and video conferencing will help achieve similar results.
The intent is to organize research in a way that can "change the rate in which something is actually done," said Chu. The key is moving technology innovations from the lab to the private sector as quickly as possible, he said.
Improving battery technology is seen as pivotal to transportation and storage, particularly around the need to store solar and wind energy.
Chu said the idea of seeking a 5X improvement is really around getting the battery and energy storage prices to a point where they will gain widespread adoption.
"We look very carefully at the price points," said Chu, who cited the impact of falling prices on cell phones of PCs, as examples of how low prices trigger mass adoption.
Chu said the effort is "very, very important for American industrial competitiveness that research be intimately linked with manufacturing in a way that will propel the United States forward. This is what the whole Hub concept is about."
The intent isn't to aim for incremental improvements of existing technology, but to seek new approaches and "rapidly drive towards electrochemical energy storage solutions beyond the current limits," according to DOE's proposal.
DOE, in its solicitation for proposals, said current battery research is typically focused on one particular problem "and thus lacks the resources and the diverse breadth of talent to consider holistic solutions."
The goal of the Hub is to create a "critical mass for the best, most innovative and far-reaching ideas."
"Based on new understanding, the Hub should foster new energy storage designs that begin with a 'clean sheet of paper' -- overcoming current manufacturing limitations through innovation to reduce complexity and cost," said DOE.
Chu personalized the results.
When his home lost power recently, Chu did some calculations and concluded that with improved battery storage, someone could halve the number of solar panels on their roof. "You can be 80% self-sufficient and blackout immune," he said. If such a system cost less than $10,000, "I would get that," he said.
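Chu's back-of-the-envelope logic can be reproduced with equally rough numbers: storage lets a smaller array cover the same daily load because surplus midday output is banked instead of wasted. Every figure below is an illustrative assumption, not from the DOE.

```python
daily_load_kwh = 30.0  # typical U.S. household daily use, assumed
sun_hours = 5.0        # effective full-sun hours per day, assumed
panel_kw = 0.25        # per-panel rating, assumed

# Without storage, midday surplus is wasted; assume only half the
# output is usable when generation and demand don't line up.
panels_no_storage = daily_load_kwh / (sun_hours * panel_kw * 0.5)

# With storage, nearly all output is banked and usable.
panels_with_storage = daily_load_kwh / (sun_hours * panel_kw * 0.95)

print(round(panels_no_storage), "panels without storage")  # 48
print(round(panels_with_storage), "panels with storage")   # 25 -- about half
```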
Patrick Thibodeau covers SaaS and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov, or subscribe to Patrick's RSS feed. His e-mail address is pthibodeau@computerworld.com.
Read more about emerging technologies in Computerworld's Emerging Technologies Topic Center.


Monday, November 19, 2012

Why Your Data Center TCO and Location Matter


Selecting a good location for your data center is a critical element of data center planning, as deciding where to build and maintain a facility directly impacts the total cost of ownership (TCO) over the lifetime of the data center.
In looking to either purchase an existing facility or build a new data center, there's an exhaustive list of factors to be weighed and analyzed before you select a site. To that end, CIO.com spoke to industry leaders to learn more about considerations that range from the probability of basing your data center in an area where a natural disaster could occur to the availability of utilities and the cost of energy.
Key Data Center Location Considerations: Expenses, Expenses and Expenses
According to data center solutions provider Lee Technologies, a subsidiary of Schneider Electric, one basic mistake organizations make is failing to take TCO into account. In its report, The Top 9 Mistakes in Data Center Planning: The Total Cost of Ownership Approach, Lee Technologies recommends that the best approach is to focus on three basic TCO parameters: capital expenses, operations and maintenance expenses, and energy costs.
Keith Lambert, senior vice president of design, build and construction for Lee Technologies, says the company looks at potential sites for clients looking to build new or retrofit an existing structure. Depending on the organization's needs, there are a number of different ways to approach site selection.
[Photo: The TechVault data center in Vermont deploys Schneider Electric equipment. Image credit: Tom Way Photography]
"We're mainly interested in site selection if there are tax incentives in the area-and only if they are true incentives," Lambert says. Many communities, especially those in rural areas, offer incentives aimed at attracting investment and creating jobs in construction as well as information technology.
Utility costs matter as well, Lambert adds. "For example, we want to know the cost of water per gallon, the electricity rate and also the cost of [sewer] discharge."
There's also real estate, infrastructure, materials and labor to consider. At the moment, Oregon is a popular data center location, says David Eichorn, data center practice head for Akibia, which offers services to improve the availability, reliability and performance of data centers. That's due to a combination of a highly skilled labor force, favorable climate and a lower cost of living than, say, neighboring California. (For many of the same reasons, Canada is also attracting the attention of firms looking to build a data center.)
Finally, during the site selection process, it's critical to examine network connectivity in the area and find out how close to the facility it runs. Depending on the complexity of the site and redundancy levels, the availability of multiple power sources may be a key factor for some companies.
Don't Let Energy Costs Overwhelm Your Data Center
Power doesn't come cheap. Rob Woolley, senior vice president of critical environment services for Lee Technologies, says energy costs -- and the types of deals you can get from various providers in the area -- have become increasingly important over the past 10 years.
"The cost of energy and availability of utilities...is at the top of everyone's list of selection criteria," Woolley says, adding that green initiatives such as free cooling can have "a major impact on savings."
Make no mistake, utility costs can be the deal breaker when choosing a site. For Lee Technologies, energy cost is also a leading factor in getting a client to build in a specific area -- especially if there's hydroelectric power or another source of energy nearby that drives operational costs down. "Hydro is great power. Not only is it relatively inexpensive compared to other sources, but it's also very clean. There's very little carbon associated with hydroelectric power," Woolley says.
Akibia's Eichorn, for his part, says that green IT is one of the biggest changes in the industry. "In the past couple years, it's a positive trend that has taken hold in the data center industry."
Eichorn agrees that power has become an increasingly significant cost driver in data centers. At the same time, green initiatives have people talking about how to better manage power consumption. As a result, he says, there are many new techniques available to companies today.
"Companies use green, and they use it for different reasons," Eichorn says. "There is an emphasis on being environmentally conscious, but there's also the [monetary] value...that being green brings to the table."
Backup Data Center Shouldn't Be Too Close -- Or Too Far
Most companies don't plan for just one data center at a time, Eichorn notes. Usually, it's two: a primary facility and a business continuity and disaster recovery redundant facility.
One of the biggest concerns, Eichorn says, is the proximity of the two data centers. Putting one facility in an area that's prone to natural disasters is risky enough, but if your data centers are too close, a hurricane, tornado or other big storm could take out both facilities, he says.
At the same time, if the data centers are too far apart, turn-around time suffers. In addition, putting facilities in another state, province or territory will increase the overall cost (and complexity) of capital and labor, which is an important consideration for small and medium-sized companies. (Larger firms with multiple offices, of course, have more options for where to base a data center.)
In the end, Lambert says, most businesses establish a perimeter or short-list of ideal regions and go from there. If you narrow your site selection down to a few regions, then you can analyze the benefits of each location and plan your facility.
The good news for companies looking to expand facilities or invest in new data center builds is that the industry is more open today than it was in the past. Far from being proprietary and secretive, today's data center operators show more interest in and willingness to share their experiences and knowledge with other companies.
"When people bring knowledge and technology advancements to the table, it brings forth a more open environment and accessibility to green initiatives for everyone," Eichorn says.
Based in Nova Scotia, Canada, Vangie Beal has been covering small business, electronic commerce and Internet technology for more than a decade. You can tweet with her online @AuroraGG. Follow everything from CIO.com on Twitter @CIOonline, on Facebook, and on Google+.
Read more about disaster recovery in CIO's Disaster Recovery Drilldown.

