Friday, 2 November 2012

Microsoft to tie Yammer, Skype to its CRM application

Microsoft is preparing to release an update to its Dynamics CRM Online software that will feature a new user experience as well as tie-ins to its Yammer social networking software and Skype communication platform, the company announced Thursday.

In addition, Microsoft is planning to release a mobile CRM application in mid-2013 that will run on Windows 8 and iPad devices, said Seth Patton, senior director of product marketing.

The CRM Online update is coming in December and will feature new "role-specific" user experiences incorporating prebuilt processes that "guide sales and service professionals through pre-defined lead, opportunity and case management processes," Microsoft said in a statement.

But the interfaces are completely "opt-in," with Microsoft remaining committed to users who prefer to use the Outlook interface, Patton said. They are also "fully configurable" to a user's personal preferences, he said.

Meanwhile, the Yammer integration represents "the first phase of integrating Yammer as a social layer" within Dynamics CRM, Patton said. Users will be able to post messages from Dynamics CRM to Yammer, and vice versa.

Rival CRM vendor Salesforce.com has included the Chatter social software within its CRM system for a couple of years now, but Microsoft isn't necessarily playing catch-up, according to Patton. "We're looking to take advantage of the fact that [Yammer] has broad viral adoption," and Microsoft will make it "super-seamless" to use Yammer with CRM, he said.

Users will also be able to make Skype calls from within the CRM interface.

Other new features in the December update include support for the Firefox and Chrome browsers on Windows PCs, as well as Safari on Macs, Microsoft said.

Microsoft is also adding platform-level features, such as support for customized .NET workflow services and APIs (application programming interfaces) for bulk data loads.

Chris Kanaracus covers enterprise software and general technology breaking news for The IDG News Service. Chris' email address is Chris_Kanaracus@idg.com.


Amazon drops cloud prices, again

Amazon Web Services, fresh off an outage that brought down big-name sites such as Reddit and Imgur, today announced an 18 percent price reduction for its virtual machines, the 21st time the leading infrastructure as a service (IaaS) vendor has dropped prices since launching its cloud in 2006.

In addition to the price drop, AWS released a new series of Elastic Compute Cloud (EC2) instances with high input/output performance. They're optimized, AWS says, for media encoding, batch processing, caching and Web serving. The extra-large instance (m3.xlarge) comes with 15GB of memory and 13 ECUs (EC2 Compute Units) across four virtual cores. A double extra-large instance has 30GB of memory with 26 ECUs on eight virtual cores. The instances debuted in the Northern Virginia US-East region, but AWS plans to roll them out to other regions early next year.

Small instances went from $0.08/hour to $0.065/hour, medium instances from $0.16/hour to $0.13/hour, large instances from $0.32/hour to $0.26/hour, and extra-large instances from $0.64/hour to $0.52/hour.
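
The advertised figure can be sanity-checked against the quoted per-hour prices; a quick sketch (sizes and prices as quoted, nothing else assumed):

```python
# Sanity-check the ~18 percent cut against the quoted (old, new)
# on-demand prices in dollars per hour for each instance size.
prices = {
    "small":       (0.080, 0.065),
    "medium":      (0.160, 0.130),
    "large":       (0.320, 0.260),
    "extra-large": (0.640, 0.520),
}

for size, (old, new) in prices.items():
    cut_pct = (old - new) / old * 100
    print(f"{size}: {cut_pct:.2f}% cheaper")  # each size works out to 18.75%
```

Every size gets the same proportional cut, which is why a single headline percentage covers the whole family.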

Dan Feld, who manages sales and business development at AWS consultancy Newvem and formerly held the same role at Amazon in Europe, says the price drop for small instances is a significant move for a lot of customers. According to a sample of 40,000 instances that Newvem monitors for customers, m1.small is the most popular type, making up more than 25 percent of instances; all of those customers will see their prices drop. Along with the price cut, AWS is filling out its product portfolio with higher-end instance types as well, he says.

"Amazon sure doesn't want to make it easy for competitors," says cloud analyst Paul Burns of Neovise about the company's 21st price drop in six years. As processing performance continues to improve, Burns says AWS has taken the approach that it will pass those savings on to customers whenever it can, as opposed to waiting for some major breakthrough in performance or cost.

It's the same idea with the new instances. The high-end instance types follow a trend at AWS in recent years: the company has rolled out high-performance compute options, cluster compute instance types, and now these new large and extra-large high-memory offerings.


ARM, Microsoft collaborating on 64-bit Windows version

ARM is working with Microsoft to tune the Windows OS to work on processors based on ARM's 64-bit architecture, an ARM official said this week.

Ian Forsyth, program manager at ARM, could not comment on a specific release date for the 64-bit version of Windows for ARM processors, but said ARM is continuously working with software partners to add 64-bit support.

"ARM works with all its OS and ecosystem partners to inform them on next generation technologies and enable their support," said Nandan Nayampally, head of ARM's processor marketing division, in an email statement. ARM's TechCon show is currently going on in Santa Clara, California.

Specific product support questions would need to be directed to the partners, Nayampally said. A Microsoft spokesman declined via email to comment on specifics of a 64-bit version of Windows RT, saying the company had no information to share at this time.

Microsoft last week released Windows RT, a 32-bit OS that works with ARM processors, and also released Windows 8, a 64-bit OS that works on x86 processors. ARM this week announced its first 64-bit processor designs, the Cortex-A57 and Cortex-A53, which are based on ARM's ARMv8 architecture. The chip designer said it expects servers and mobile devices based on the processors to reach the market in 2014.

Windows RT is on tablets with 32-bit processors from Nvidia and Qualcomm. Microsoft's Surface and Asus' Vivo Tab RT tablet have Nvidia's quad-core Tegra 3 processor, while Dell's XPS 10 and Samsung's P8510 Ativ Tab have Qualcomm's dual-core Snapdragon S4 processor.

The 32-bit Windows RT OS has a limited memory ceiling, and a 64-bit Windows RT OS would expand the memory capacity in tablets and PCs. A 64-bit version of Windows on ARM would also bring it on par with Windows 8.
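
The ceiling follows from pointer width: 32-bit addresses can distinguish at most 2^32 bytes, about 4GB, before workarounds; a one-line illustration:

```python
# Maximum bytes addressable with 32-bit vs. 64-bit pointers.
# (In practice the usable limit is lower, since the OS reserves
# part of the address space for itself.)
max_32 = 2**32
max_64 = 2**64

print(max_32 // 2**30)  # 4 -- GiB addressable by a 32-bit pointer
print(max_64 // 2**30)  # 17179869184 -- GiB, i.e. 16 EiB in total
```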

Nvidia is developing a processor core based on ARM's 64-bit architecture under the code-name Project Denver. Nvidia declined to comment on development of 64-bit software for Windows.

A Qualcomm spokeswoman said the company cannot comment at this time on specific product plans. However, Qualcomm is an ARM partner and helps explore and evaluate emerging technologies including 64-bit software support, the spokeswoman said in an e-mail.

Microsoft's interest isn't surprising since the move to 64-bit seems like a natural progression for ARM and supporting vendors, just as it was for x86, said Charles King, principal analyst at Pund-IT.

But software would need to be written to support the 64-bit ARM instruction set, and porting many x86 64-bit applications is a challenge, King said. Existing applications that ran on previous versions of Windows do not run on RT.

"From a purely technical perspective, porting many common x86 applications to ARM is problematic," King said.

There are also questions about how developers will handle the move from 32-bit to 64-bit, King said. But if customers want applications, developers will deliver.

"These are some of the obvious challenges. Fortunately, everyone involved has a year or more to sort things out," King said.

Agam Shah covers PCs, tablets, servers, chips and semiconductors for IDG News Service. Follow Agam on Twitter at @agamsh. Agam's e-mail address is agam_shah@idg.com


Native mobile app dev vs. HTML5: Why not both?

The ongoing debate over how to best get applications onto mobile devices -- either through native deployments or writing a mobile Web application -- is going to remain a front-burner question for developers to ponder, given the pros and cons of both, developers say.

Developers see benefits to both approaches, as well as to hybrid applications that mix the two. Some application builders are using dev tools like Appcelerator Titanium, which compiles Web-based mechanisms like JavaScript to native code. Web-based, or HTML5, development provides a quick way to get some applications to multiple devices, developers say. But native development, such as with Objective-C for Apple iOS and Java for Google's Android devices, offers access to the full breadth of a particular device's capabilities, which is often worth the cost of having to develop the code (though not the underlying logic) independently for separate platforms.

The native experience is second to none
"The Web and HTML5 have come a long way, but they have not gotten to the native experience -- the UI, the multitouch, what users expect from an application -- yet," says Jesse Newcomer, mobile development manager at Homes.com.

Freelance developer Ketan Majmudar finds problems with the offline nature of mobile Web applications compared to native applications -- applications either have to talk to an online Web service to pull down data or need a data store bundled with them. "HTML5 as a technology is not mature enough yet. It's nearly there, but there's a lot of hoops you have to jump through," such as with data downloading, he says. Native applications, meanwhile, can have data stored in a bundle when an app is downloaded. "The majority of your data is in place."

"Native development will never go away. Objective-C developers will always be required," Majmudar says. Adds developer Paul Nelson, a systems engineer and Web developer at logistics services company Morgan Supply on Demand: "I notice speed and the ability to control memory more when you do native." He says Facebook made a "huge mistake" in creating an HTML5 application for iOS (an effort that did not succeed). "They have the money and the resources to make a native app."

Plus, native development sometimes is just necessary to access certain features, such as the Siri voice-command capability in iOS, says Jonnie Spratley, director of product design at mobile experience provider Appmatrix. "There will always be a need just because of certain features," Spratley says.

HTML5 and hybrid approaches take hold
Although developers concur on the strengths of native development, they can't overlook the easy option of Web development or hybrid development. "It's a spectrum -- not a binary -- choice," says Kyle Simpson, a JavaScript architect at Getify Solutions. "The spectrum of how much native you embrace versus how much Web you embrace is very different, depending on the company."

Recompilation technologies like Adobe PhoneGap and Appcelerator Titanium let developers leverage Web development efforts on mobile platforms, Simpson notes. But well-liked tools like Titanium aren't perfect. "Titanium does have its quirks that you have to work through," such as getting UI pieces to work well, Homes.com's Newcomer says.


Beyond BlackBerry: Pentagon opens its door to iPhones, Android devices

In another blow to RIM's fortunes, the U.S. Department of Defense may be willing to consider smartphones other than BlackBerries if they can meet the government's tough security rules.

The DOD is inviting vendors to bid on software to secure non-RIM smartphones and tablets, according to a report by Reuters. The Defense Information Systems Agency (DISA) may award a contract in April 2013. The contract would cover 162,500 devices to start and ultimately reach 262,500.

The request for proposals was posted Oct. 22, the day the U.S. Immigration and Customs Enforcement Agency announced it will end its contract with RIM to adopt iPhones instead, according to Reuters.

The DOD is not scrapping its BlackBerries, but expanding the devices it may allow for use. The Reuters story quoted a DOD spokesperson: "DISA is managing an enterprise email capability that continues to support large numbers of RIM devices while moving forward with the department's planned mobile management capability that will support a variety of mobility devices."

One of the companies bidding on the management software contract will be RIM itself, offering its BlackBerry Mobile Fusion application for managing Android and iOS devices.

The DOD's decision shows how dramatically the smartphone and tablet market has changed in the five years since the iPhone was first released. RIM has relied on its vaunted secure network connections, and device and operating system security to become the standard mobile device in many government agencies and security-conscious enterprises. It can no longer do so.

Apple has been steadily improving iOS security and management capabilities, adding on-device encryption, securing each device's unique AES encryption key, and adding programming interfaces for use by mobile device management (MDM) software vendors.


Microsoft launches hosted ALM service

After a year in beta, Microsoft has launched its Team Foundation Service, a hosted version of its application lifecycle management (ALM) software. For the time being, however, the service is limited to teams of five or fewer users.

"ALM has traditionally been known to be very enterprise heavy, but [this service] could be utilized by people who may not need enterprise scale but could still benefit from tools and services to manage their projects," said Karthik Ravindran, senior director of ALM marketing and management.

There will be no cost for using the service with five or fewer users, and it can be used for an unlimited number of projects. Subscribers to Microsoft's MSDN Premium, Ultimate and Test Professional plans will also get free access along with their subscriptions. Microsoft launched the service in conjunction with its annual developer-focused Build conference this week in Redmond, Washington.

Microsoft did not offer a date for when the service will be available to more than five users, nor say how much it will cost once larger teams are supported. The costs will be based on a combination of features and usage of computational and networking resources, Ravindran said.

Running on Microsoft's Windows Azure cloud service, Team Foundation Service is a hosted version of the company's Team Foundation Server (TFS) ALM software. The service offers most of the capabilities of TFS, including version control, work item tracking, project planning and management, build automation, and continuous deployment. Build tools are still offered only in preview mode.

TFS supports not only the development of .NET software for Windows, but other languages as well, including Java, PHP and JavaScript. The service can be incorporated into the Microsoft Visual Studio, Eclipse and Mac-centric Xcode IDEs (integrated development environments).

Microsoft is initially marketing the service to smaller ISVs (independent software vendors) as well as to larger organizations that may want to try ALM without purchasing the software. In the long term, Microsoft will offer the service as a full-scale replacement for on-premises ALM, or to be used in a hybrid mode where code management is shared between in-house servers and cloud services, Ravindran said.

The hosted service does not include all the capabilities of TFS, such as the ability to easily connect the ALM services with in-house deployments of other Microsoft server-based products, such as SharePoint, Ravindran said.

But one advantage that the hosted service would offer over TFS itself is that it is closely tied in with the Microsoft Windows Azure PaaS (platform as a service), Ravindran said. Someone building an Azure-based service can link the hosted ALM service directly with their Azure account, allowing them to "set up a continuous deployment where the bits can be seamlessly deployed into the Azure end-point," Ravindran said.


What's the price of a new Windows 8 zero-day vulnerability?

It's not exactly the type of advertisement most people would understand.

For sale: "Our first 0day for Win8+IE10 with HiASLR/AntiROP/DEP & Prot Mode sandbox bypass (Flash not needed)." It's part of a recent message on Twitter from Vupen, a French company that specializes in finding vulnerabilities in widely used software from companies such as Microsoft, Adobe, Apple, and Oracle.

Vupen occupies a grayish area of computer security research, selling vulnerabilities to vetted parties in governments and companies but not sharing the details with affected software vendors. The company maintains that its information helps organizations defend themselves from hackers and, in some cases, play offense as well.

Vupen has found a problem somewhere in Microsoft's new Windows 8 operating system and its Internet Explorer 10 browser. The flaw has not been publicly disclosed or fixed by the company yet.

Vupen's finding is one of the first vulnerabilities reported for Windows 8, released last week, and Internet Explorer 10, although flaws have since been found in third-party software that runs on Windows 8.

Dave Forstrom, Microsoft's Trustworthy Computing director, said the company encourages researchers to participate in its Coordinated Vulnerability Disclosure program, which asks that people give it time to fix the software problem before publicly disclosing it.

"We saw the tweet, but further details have not been shared with us," Forstrom said in a statement.

Vupen's Twitter message, written on Wednesday, implies the vulnerability would allow a hacker to bypass security technologies contained within Windows 8, including high-entropy Address Space Layout Randomization (ASLR), anti-Return Oriented Programming measures and DEP (data execution prevention). The company also indicates the exploit is not dependent on a problem with Adobe Systems' Flash multimedia software.

"Certainly, if the bug is confirmed, then this could be a black eye for Microsoft having their brand new and touted most secure platform already found flawed just after its public release," said Andrew Storms, director of security operations for nCircle.

The market opportunity for a successful exploit may be limited due to the recent release of Windows 8, but "on the other hand, nobody has confirmed this bug isn't also functional on older versions of Windows or IE," Storms said.

Jody Melbourne, a penetration tester and senior consultant with the Sydney-based Australian security company HackLabs, said the vulnerability could be useful to third-party Microsoft developers interested in stealing code-signing certificates or source code.

So what's the vulnerability worth? It's hard to say. Vupen doesn't publish a public price list. But Melbourne said "the value of the bug will only increase with time, of course, the longer Vupen sits on it and if no one else stumbles upon it."

Send news tips and comments to jeremy_kirk@idg.com. Follow me on Twitter: @jeremy_kirk.


Following Sandy, DHS seeks security 'Cyber Reserve'

The damage to the electrical grid from Superstorm Sandy is just a taste of what could happen from a major cyber attack, says Department of Homeland Security (DHS) Secretary Janet Napolitano.

And a DHS task force said this week that one way to minimize that kind of risk is to recruit a "Cyber Reserve" of computer security pros who could be deployed throughout the country to help the nation defend against and recover from such an attack.

Napolitano and other high government officials have been preaching about the escalating threats, particularly from hostile nation states like Iran, Russia and China, for some time.

The Hill reported that at a cybersecurity event hosted by the Washington Post, Napolitano said that while recent news has focused on financial institutions being hit with distributed denial-of-service (DDoS) attacks, the nation's control systems for major infrastructure such as utilities and transportation were also being targeted.

The Secretary used Hurricane Sandy to make the point. "If you think that a critical systems attack that takes down a utility even for a few hours is not serious, just look at what is happening now that Mother Nature has taken out those utilities," Napolitano said.

Government officials have been invoking the Pearl Harbor image for years. Defense Secretary Leon Panetta did it again just a few weeks ago, saying in a speech in New York that such an attack would, "cause physical destruction and the loss of life. In fact, it would paralyze and shock the nation and create a new, profound sense of vulnerability."

For good measure, he also called it a "pre-9/11 moment."

The security community is divided over the depth of the threat. Most experts say the threats are real, but not at the level of a catastrophic military attack.

Bruce Schneier, author and chief security technology officer at BT, told CSO Online this year: "Throughout history, the definition of a 'major war' has involved casualties in the hundreds of thousands. That means dead people."

Panetta did invoke the risk of dead people. "[Attackers could] derail passenger trains, or even more dangerous, derail passenger trains loaded with lethal chemicals," he said. "They could contaminate the water supply in major cities, or shut down the power grid across large parts of the country."


Dell testing 64-bit ARM server with chip from AppliedMicro

Dell has built a prototype server based on a 64-bit ARM processor from Applied Micro Circuits, and the chip designer showed the system at a conference in Silicon Valley on Thursday.

Dell has already said it is testing servers based on 32-bit ARM chips from Marvell and Calxeda, but this is the first time it has shown any hardware based on a 64-bit ARM processor. Sixty-four-bit chips are generally better suited to server use than 32-bit parts.

Proponents say ARM chips will be more energy efficient than Intel's x86 processors for certain cloud and analytics workloads, but the market is in its early stages, with plenty of hardware and software development work still to be done. Analysts estimate the first 64-bit ARM servers won't actually hit the market before 2014.

AppliedMicro hosted a session on Thursday at ARM's TechCon conference, where it tried to illustrate how various elements of the 64-bit ARM server "ecosystem" are coming together.

It was joined by representatives from Red Hat and Cloudera, both of which said they will have software ready for testing on 64-bit ARM chips next year. Oracle was also there, pledging a version of Java SE for 64-bit ARM processors, though it didn't give a timeframe.

AppliedMicro CEO Paramesh Gopi, in full showman mode, pulled away a black cloth cover to reveal the Dell server at the end of his talk. He didn't describe it in any detail but it appeared to be a two-rack-unit chassis with four or five individual servers, or "sleds," that slide into the frame.

The hardware was a prototype, and it's still unknown whether Dell will actually sell an ARM-based server using AppliedMicro technology. Dell is experimenting with ARM components from several suppliers, and it was also present at AMD's event on Monday, when AMD announced plans to build ARM-based server chips.

"We don't have any plans to make generally available an ARM-based server right now -- that includes the Applied Micro-based prototype you saw," Dell spokeswoman Erin Zehr said via email. "We're currently focused on ecosystem enablement -- giving developers access to clusters so they can test or write to ARM," she said.

The processor inside the Dell system, which AppliedMicro called an "X-Gene" processor, was also an early prototype. Gopi said X-Gene parts will be ready for customers to begin testing in the first quarter next year, with commercial products coming later in 2013.

But AppliedMicro does now have actual prototype silicon, which is a step up from the HotChips conference in August, when it showed a server board with a mock-up chip.

It demonstrated its hardware in action Thursday. It showed a website running on what Gopi said was a prototype X-Gene server built by AppliedMicro and located in a remote data center. He streamed a trailer for the new James Bond film, which appeared to run smoothly.

"We are literally months away, ladies and gentlemen," he said. "In Q1 next year, you'll have not only silicon but also the software I just showed you and systems to go around it." He was still referring to prototype systems, however.


Many Apache Web servers put popular websites at risk

Many Apache Web servers, including those hosting some popular websites, expose information about the internal structure of the sites they host, the IP (Internet Protocol) addresses of their visitors, the resources users access, and other potentially sensitive details because their status pages are left unprotected.

The Apache mod_status module generates a "server status" page that contains information about the server's CPU and memory load, as well as details about active user requests, including paths to various internal files and IP addresses.

While this page can be a valuable resource for server administrators, the information it exposes can help hackers better plan their attacks, Daniel Cid, chief technology officer of Web security firm Sucuri, said Tuesday in a blog post.

Sucuri researchers ran a test that involved crawling over 10 million websites and found hundreds of them that expose their server status pages to the whole world. The list of affected websites includes php.net, metacafe.com, disney.go.com, staples.com, nba.com, cisco.com, ford.com, apache.org and many others. Some of them have fixed the problem since Sucuri's report, but many haven't.

"Is that a big deal that I can go to staples.com/server-status/ and see all those orders/connections being made and their IPs?" Cid said. "Or go to one of them and search for 'admin-p' and find a mostly unprotected admin panel (I won't disclose the site). Or find all the internal URLs and vhost mapping for nba.com or ford.com?"

"Probably not a big deal by itself (well, if you don't have an unprotected admin panel), but that can help attackers easily find more information about these environments and use them for more complex attacks," the security researcher said.

At first glance, the solution is simple: add access control directives in the server configuration file in order to restrict access to the /server-status path and only allow access to IP addresses that need to have access to the page.
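
A minimal sketch of that restriction, using Apache 2.2-style directives (the allowed addresses here are illustrative; Apache 2.4 replaces Order/Deny/Allow with "Require ip ..."):

```apache
# httpd.conf -- expose the mod_status page only to trusted addresses.
<Location /server-status>
    SetHandler server-status
    Order deny,allow
    Deny from all
    # Illustrative addresses: localhost plus one admin workstation.
    Allow from 127.0.0.1 192.0.2.10
</Location>
```

If the page is not needed at all, the simplest fix is to not load mod_status in the first place.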

However, server administrators need to consider the configuration of their whole Web infrastructure, because there are some scenarios in which the Apache access control directives could be inadvertently bypassed.

For example, if a Squid Web caching service is running in reverse proxy mode on the same machine as a Web server that allows /server-status access only from the local IP address, the restriction will be bypassed, Marcus Povey, an independent IT consultant and software developer from the U.K., said Thursday via email.

This happens because requests from users are received by the Squid proxy first and then passed to the Web server, causing the server to see the requests as coming from the proxy's local IP address.

When running in this configuration, Squid stores static versions of pages generated by the Web server for a limited period of time and serves them to users, which prevents the server from overloading when dealing with a lot of traffic. Without caching services like Squid, the Apache server would have to regenerate PHP or other dynamic pages for every visitor, which can quickly consume the machine's available resources during a traffic spike.
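
The setup Povey describes looks roughly like this (hostname and ports are illustrative; Squid accelerator-mode syntax):

```squid
# squid.conf -- Squid as a reverse proxy (accelerator) in front of
# Apache on the same machine. Squid listens on 80, Apache on 8080.
http_port 80 accel defaultsite=www.example.com
cache_peer 127.0.0.1 parent 8080 0 no-query originserver name=apache
```

In this configuration every request Apache sees arrives from 127.0.0.1, so an "Allow from 127.0.0.1" rule on /server-status effectively matches the entire Internet.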

Running Squid and Apache on the same machine is particularly popular with owners of smaller websites, who can't afford to run these services on separate machines, Povey said. "Many, myself included, just have a rented box in a data center somewhere."

Larger companies might run dedicated Web caching servers. However, even in those cases, if the Apache access control rules for /server-status allow access to the whole IP range of the internal network, which includes the caching servers, the same problem would occur.

"You'd have to take extra steps to ensure you didn't expose this," Povey said. "Server-status et al [other server info pages like the one generated by mod_info] is something that is easy to overlook."


Facebook, others back Linaro's Linux-on-ARM project

Facebook, Red Hat, Hewlett-Packard and other big vendors have joined a project to develop Linux OS software for the upcoming generation of ARM-based servers, the companies announced Thursday.

Advanced Micro Devices, Applied Micro, Calxeda, Canonical, Cavium, and Marvell are among the other companies to join the Linaro Enterprise Group within Linaro, a not-for-profit, multivendor engineering group. They join existing members ARM, HiSilicon, Samsung and ST-Ericsson.

Building an ecosystem of software and hardware is seen as essential for ARM-based servers to succeed, and Linux is a big part of that. ARM servers are expected to be adopted initially by big online service providers, many of whom rely on Linux in their operations.

In an announcement Thursday, the Linaro Enterprise Group said it will initially work on low-level Linux boot architecture and kernel software for use by system-on-chip vendors, commercial Linux providers and server manufacturers.

The group said it expects to deliver some software before the end of 2012, with follow-up releases after that. Indeed, Linaro and ARM have already been working together to release early Linux code for the ARMv8 architecture.

ARM CEO Warren East was expected to discuss the news further on Thursday morning at ARM TechCon, joined by George Grey, CEO of Linaro.

"Linaro is building a high-quality software engineering team that is working with our members on the development of key enabling software for the new generation of low-power, high-performance, hyperscale servers," Grey said in a statement.

James Niccolai covers data centers and general technology news for IDG News Service. Follow James on Twitter at @jniccolai. James's e-mail address is james_niccolai@idg.com



Firefox to force secure connections for selected domains

Mozilla has introduced a preloaded list of domains that Firefox will only connect to over secure connections, a measure intended to help protect users' privacy and security.

To force secure connections between the browser and a server, Mozilla uses HSTS (HTTP Strict Transport Security), a mechanism used by servers to indicate that the connecting browser must use a secure connection, wrote Mozilla's David Keeler in a blog post.
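As an illustration, the signal is a response header the server sends over HTTPS; the directive values below are examples, not taken from any particular site:

```http
Strict-Transport-Security: max-age=31536000; includeSubDomains
```

The max-age directive tells the browser how long, in seconds, to insist on HTTPS for that host, and includeSubDomains extends the policy to all of its subdomains.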

[ Learn how to protect your systems with Roger Grimes' Security Adviser blog and Security Central newsletter, both from InfoWorld. ]

When the browser connects to an HSTS server for the first time, though, it does not know whether it should use a secure connection, because it has never received an HSTS header from that host. "Consequently, an active network attacker could prevent the browser from ever connecting securely (and even worse, the user may never realize something is amiss)," Keeler wrote, adding that setting up the connection that way still leaves it vulnerable to attacks.

As a workaround for that problem, Mozilla has added a list to Firefox with domains that the browser should only connect to securely by default.

"When a user connects to one of these hosts for the first time, the browser will know that it must use a secure connection. If a network attacker prevents secure connections to the server, the browser will not attempt to connect over an insecure protocol, thus maintaining the user's security," Keeler said.
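The decision the browser makes can be sketched as follows. This is an illustrative model of the preload-plus-remembered-hosts logic described above, not Firefox's actual implementation; the class and host names are hypothetical:

```python
# Illustrative model of browser-side HSTS: hosts in the preload set
# are upgraded to HTTPS even on the very first connection; other
# hosts are upgraded only after an HSTS header has been seen.

PRELOADED = {"paypal.com", "twitter.com", "lastpass.com", "torproject.org"}

class HstsStore:
    def __init__(self, preloaded=PRELOADED):
        # Start with the shipped preload list.
        self.known = set(preloaded)

    def note_header(self, host, headers):
        # Remember hosts that sent Strict-Transport-Security over HTTPS.
        if "strict-transport-security" in {k.lower() for k in headers}:
            self.known.add(host)

    def scheme_for(self, host):
        # Known hosts are never contacted insecurely.
        return "https" if host in self.known else "http"

store = HstsStore()
print(store.scheme_for("paypal.com"))   # preloaded -> "https"
print(store.scheme_for("example.org"))  # unknown   -> "http"
store.note_header("example.org", {"Strict-Transport-Security": "max-age=31536000"})
print(store.scheme_for("example.org"))  # remembered -> "https"
```

The preload set closes the first-connection gap: an attacker who blocks HTTPS cannot downgrade the browser to HTTP for a listed host, because the browser never attempts an insecure connection in the first place.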

The list has been seeded with domains from Chrome's HSTS preload list, which serves a similar function. Chrome forces secure connections for all google.com subdomains, and also forces HTTPS for sites that have requested it, such as paypal.com, twitter.com, lastpass.com and torproject.org.

"HSTS in combination with a preloaded list of sites can be a great tool for increasing the security of users," Keeler wrote. The feature is currently only present in Firefox Beta.

Loek is Amsterdam Correspondent and covers online privacy, intellectual property, open-source and online payment issues for the IDG News Service. Follow him on Twitter at @loekessers or email tips and comments to loek_essers@idg.com.



10 signs it may be time to quit your IT job

Working as an IT professional is demanding and competitive, and like anything in life it has its ups and downs. How many employers does the average IT pro work for over a career? It seems like a straightforward question, but the answer is surprisingly complex because the Bureau of Labor Statistics (BLS) doesn't track this data. Even if it did, you'd have to ask what constitutes a career change. Suffice it to say the average IT pro will change employers at least a few times over a 20- to 30-year career.

Most experienced IT professionals can relate to having stayed too long in at least one job over the course of their careers. It's amazing how a great job can sometimes turn overnight into a caustic situation. Other times, the epiphany is more of a slow burn that builds over time.

[ Also on InfoWorld: Steer clear of these 20 IT gotchas for the sake of your career. | Get sage advice on IT careers and management from Bob Lewis in InfoWorld's Advice Line blog and newsletter. ]

Moving on can be necessary at times for professional development, financial or even health reasons. So how can you tell whether that feeling is all in your head or whether it really is time to look for greener pastures?

CIO.com spoke with industry professionals to help spot the warning signs that indicate it may be time to quit your day job.

1. The Numbers Don't Add Up

This can be a number of things: Perhaps you notice vendor bills that are normally paid on time aren't any longer or maybe you notice the stock price tumbling. "While one bad quarter does not make a bad company, if you are seeing quarter over quarter trending downward, that is a compelling sign that the company may be in trouble and your job may be at risk," says Dave Sanford, executive vice president at WinterWyman, a career management and transition services consulting firm.

2. Where's My Money?

If you aren't getting paid on time, it's likely your company is navigating some dangerous financial straits. Dusting off your resume and preparing for the worst can only be in your best interest.

3. Your Company Doesn't Invest in Its Employees

When you feel like there is room for advancement and your company is supportive when it comes to professional development, it shows in your work. You are compelled to do your best. The opposite is true for companies that don't create clear advancement paths or don't help their employees grow professionally.

4. You Dread Going to Work Each Day

If your passion is gone, this is an obvious sign that something is wrong. Whether it's long bouts of boredom or a boss from hell, working in a situation like this is not good for you professionally or for your health.

If going to work ties you up in a knot or you're losing sleep thinking about work, this is a strong indicator that it may be time to start looking. "Everyone deserves to find a job that meets their values, the opportunity to use and excel in your top skill set and doing something that you love. Once your ability to get a solid night's sleep doesn't occur, you are not performing at your best," says Jayne Mattson, senior vice president at Keystone Associates, a company that specializes in helping mid-to-senior level individuals in new career exploration, networking strategies and career decisions.

5. Your Company Has Undefined Goals or Isn't Keeping Up With the Industry

