Wednesday, September 30, 2015

As containers take off, so do security concerns

Containers offer a quick and easy way to package up applications but security is becoming a real concern

Containers offer a quick and easy way to package up applications and all their dependencies, and are popular for development and testing.

According to a recent survey sponsored by container data management company ClusterHQ, 73 percent of enterprises are currently using containers for development and testing, but only 39 percent are using them in a production environment.

But this is changing: 65 percent said that they plan to use containers in production in the next 12 months, and respondents cited security as their biggest worry. According to the survey, just over 60 percent said that security was either a major or a moderate barrier to adoption.

Containers can be run within virtual machines or on traditional servers. The idea is somewhat similar to that of a virtual machine itself, except that while a virtual machine includes a full copy of the operating system, a container does not, making them faster and easier to load up.

The downside is that containers are less isolated from one another than virtual machines are. In addition, because containers are an easy way to package and distribute applications, many are doing just that -- but not all the containers available on the web can be trusted, and not all libraries and components included in those containers are patched and up-to-date.

According to a recent Red Hat survey, 67 percent of organizations plan to begin using containers in production environments over the next two years, but 60 percent said that they were concerned about security issues.
Isolated, but not isolated enough

Although containers are not as fully isolated from one another as virtual machines are, they are more secure than running applications on their own.

"Your application is really more secure when it's running inside a Docker container," said Nathan McCauley, director of security at Docker, which currently dominates the container market.

According to the ClusterHQ survey, 92 percent of organizations are using or considering Docker containers, followed by LXC at 32 percent and Rocket at 21 percent.

Since the technology was first launched, McCauley said, Docker containers have had built-in security features such as the ability to limit what an application can do inside a container. For example, companies can set up read-only containers.
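A read-only container of the kind McCauley describes can be created with a single flag. A minimal sketch, assuming Docker 1.5 or later; the public alpine image and the paths are used purely for illustration:

```shell
# Mount the container's root filesystem read-only, so a
# compromised application cannot modify binaries or drop files.
docker run --read-only --rm alpine touch /tmp/probe
# The touch fails with "Read-only file system".

# If the application needs scratch space, grant it explicitly
# instead of making the whole filesystem writable:
docker run --read-only -v "$PWD/scratch":/scratch --rm alpine touch /scratch/probe
```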

Containers also use name spaces by default, he said, which prevent applications from being able to see other containers on the same machine.

"You can't attack something else because you don't even know it exists," he said. "You can't even get a handle on another process on the machine, because you don't even know it's there."
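The namespace behavior McCauley describes is easy to observe from the command line; a sketch, with the alpine image used purely for illustration:

```shell
# Each container gets its own PID namespace by default, so a
# process listing inside it shows only the container's own
# processes -- the host's and other containers' are invisible.
docker run --rm alpine ps aux
# The host may be running hundreds of processes, but the listing
# shows only the container's own (typically just ps itself).
```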

However, container isolation doesn't go far enough, said Simon Crosby, co-founder and CTO at security vendor Bromium.

"Containers do not make a promise of providing resilient, multi-tenant isolation," he said. "It is possible for malicious code to escape from a container to attack the operating system or the other containers on the machine."

If a company isn't looking to get maximum efficiency out of its containers, however, it can run just one container per virtual machine.

This is the case with Nashua, NH-based Pneuron, which uses containers to distribute its business application building blocks to customers.

"We wanted to have assigned resourcing in a virtual machine to be usable by a specific container, rather than having two containers fight for a shared set of resources," said Tom Fountain, the company's CTO. "We think it's simpler at the administrative level."

Plus, this gives the application a second layer of security, he said.

"The ability to configure a particular virtual machine will provide a layer of insulation and security," he said. "Then when we're deployed inside that virtual machine then there's one layer of security that's put around the container, and then within our own container we have additional layers of security as well."

But the typical use case is multiple containers inside a single machine, according to a survey of IT professionals released Wednesday by container security vendor Twistlock.

Only 15 percent of organizations run one container per virtual machine. The majority of the respondents, 62 percent, said that their companies run multiple containers on a single virtual machine, and 28 percent run containers on bare metal.

And the isolation issue is still not figured out, said Josh Bressers, security product manager at Red Hat.

"Every container is sharing the same kernel," he said. "So if someone can leverage a security flaw to get inside the kernel, they can get into all the other containers running that kernel. But I'm confident we will solve it at some point."

Bressers recommended that when companies think about container security, they apply the same principles as they would apply to a naked, non-containerized application -- not the principles they would apply to a virtual machine.

"Some people think that containers are more secure than they are," he said.
Vulnerable images

McCauley said that Docker is also working to address another security issue related to containers -- that of untrusted content.

According to BanyanOps, a container technology company currently in private beta, more than 30 percent of containers distributed in the official repositories have high priority security vulnerabilities such as Shellshock and Heartbleed.

Outside the official repositories, that number jumps to about 40 percent.

Of the images created this year and distributed in the official repositories, 74 percent had high or medium priority vulnerabilities.

"In other words, three out of every four images created this year have vulnerabilities that are relatively easy to exploit with a potentially high impact," wrote founder Yoshio Turner in the report.

In August, Docker announced the release of Docker Content Trust, a new feature in the container engine that makes it possible to verify the publisher of Docker images.

"It provides cryptographic guarantees and really leapfrogs all other secure software distribution mechanisms," Docker's McCauley said. "It provides a solid basis for the content you pull down, so that you know that it came from the folks you expect it to come from."
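On the command line, Content Trust is switched on per shell with an environment variable; a minimal sketch (the image names are illustrative, and the second one is a hypothetical unsigned image):

```shell
# With Content Trust enabled, docker pull verifies the tag's
# signature before accepting the image; unsigned tags are rejected.
export DOCKER_CONTENT_TRUST=1
docker pull alpine:latest                    # proceeds only for signed content
docker pull someuser/unsigned-image:latest   # would be refused as untrusted
```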

Red Hat, for example, which has its own container repository, signs its containers, said Red Hat's Bressers.

"We say, this container came from Red Hat, we know what's in it, and it's been updated appropriately," he said. "People think they can just download random containers off the Internet and run them. That's not smart. If you're running untrusted containers, you can get yourself in trouble. And even if it's a trusted container, make sure you have security updates installed."

According to Docker's McCauley, existing security tools should work on containers the same way they do on regular applications; he also recommended that companies follow Linux security best practices.

Earlier this year Docker, in partnership with the Center for Internet Security, published a detailed security benchmark best practices document, along with a tool called Docker Bench that checks host machines against those recommendations and generates a status report.
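Docker Bench is distributed as a shell script and as a container image; an invocation along the lines of the project's published instructions looks like the following (exact flags vary between releases, so consult the current README):

```shell
# The benchmark needs wide visibility into the host to audit it,
# hence the host network/PID namespaces and the mounted Docker socket.
docker run -it --net host --pid host \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v /etc:/etc \
  docker/docker-bench-security
# Reports a pass/warn status for each CIS benchmark recommendation.
```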

However, for production deployment, organizations need tools that they can use that are similar to the management and security tools that already exist for virtualization, said Eric Chiu, president and co-founder at virtualization security vendor HyTrust.

"Role-based access controls, audit-quality logging and monitoring, encryption of data, hardening of the containers -- all these are going to be required," he said.

In addition, container technology makes it difficult to see what's going on, experts say, and legacy systems can't cut it.

"Lack of visibility into containers can mean that it is harder to observe and manage what is happening inside of them," said Loris Degioanni, CEO at Sysdig, one of the new vendors offering container management tools.

Another new vendor in this space is Twistlock, which came out of stealth mode in May.

"Once your developers start to run containers, IT and IT security suddenly becomes blind to a lot of things that happen," said Chenxi Wang, the company's chief strategy officer.

Say, for example, you want to run anti-virus software. According to Wang, it won't run inside the container itself, and if it's running outside the container, on the virtual machine, it can't see into the container.

Twistlock provides tools that add security at multiple points: it can scan a company's repository of container images, scan containers as they are loaded, and prevent vulnerable containers from launching.

"For example, if the application inside the container is allowed to run as root, we can say that it's a violation of policy and stop it from running," she said.
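A deployment can avoid tripping that kind of policy by dropping root privileges at launch; a minimal sketch using Docker's --user flag (the UID and image are illustrative):

```shell
# By default, processes in a container run as root (UID 0).
docker run --rm alpine id -u               # prints 0

# Specifying an unprivileged UID at launch drops that privilege,
# which is what a run-as-root policy check looks for.
docker run --rm --user 1000 alpine id -u   # prints 1000
```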

Twistlock can monitor whether a container is communicating with known command-and-control hosts and either report it, cut off the communication channel, or shut down the container altogether.

And the company also monitors communications between the container and the underlying Docker infrastructure, to detect applications that are trying to issue privileged commands or otherwise tunnel out of the container.

Market outlook

According to IDC analyst Gary Chen, container technology is still so new that most companies are still figuring out what value containers offer and how they're going to use them.

"Today, it's not really a big market," he said. "It's still really early in the game. Security is something you need once you start to put containers into operations."

That will change once containers get more widely deployed.

"I wouldn't be surprised if the big guys eventually got into this marketplace," he said.

More than 800 million containers have been downloaded so far by tens of thousands of enterprises, according to Docker.

But it's hard to calculate the dollar value of this market, said Joerg Fritsch, research director for security and risk management at research firm Gartner.

"Docker has not yet found a way to monetize their software," he said, and there are very few other vendors offering services in this space. He estimates the market size to be around $200 million or $300 million, much of it from just a single services vendor, Odin, formerly the service provider part of virtualization company Parallels.

With the exception of Odin, most of the vendors in this space, including Docker itself, are relatively new startups, he said, and there are few commercial management and security tools available for enterprise customers.

"When you buy from startups you always have this business risk, that a startup will change its identity along the way," Fritsch said.

Tuesday, September 22, 2015

CompTIA Server+ Certification Training 2015

CompTIA's Server+ 2015 is a vendor-neutral certification that deals with every aspect of the "care and feeding" of server computers. While nearly any computer can be used as a server in a small networking environment, many organizations require dedicated network servers built to high performance specifications. These powerful machines are called upon to handle hundreds (if not thousands) of user accounts, and all of the network activity and requests generated by these users. Additionally, there's a variety of specialized servers (e.g. database servers, file and print servers, web servers, etc.) that can be deployed to perform critical roles in organizations.

The Server+ cert is aimed at technicians (ideally with a CompTIA A+ cert) who have 18 to 24 months of professional experience working with server hardware and software. The Server+ cert was developed in consultation with several industry partners, and is recommended or required for server technicians who work for Dell, HP, IBM, Intel, Lenovo and Xerox. First released in 2001, the Server+ exam was updated in 2005, and again in 2009.

Server+ training

There are a number of different training options available for CompTIA's Server+ 2015. For students on a budget, the most affordable option involves the use of printed self-study manuals. These self-paced books are a good option for candidates who have access to a test lab outfitted with computer server hardware and software, and who feel confident in their ability to teach themselves material from texts. Self-study manuals can also give candidates the most flexibility when scheduling training sessions for themselves.

Server+ self-study manuals are available from several vendors. Students should shop online in order to find the best pricing on these materials.

Self-study
Candidates who prefer more dynamic training should look at self-paced video courseware. This form of training uses video lessons on optical disks, or may be offered through an online streaming video subscription service. Some of the vendors who create training manuals also create video courseware, and will often bundle the two products together. Self-paced video courseware can be more engaging than printed materials alone, while still offering the same flexibility when it comes to scheduling lessons.

Instructor-led training for Server+ is the most expensive option available, but offers the most beneficial learning experience to students who need interaction with a live instructor in order to learn new material. Instructor-led training can be purchased as virtual classroom courses delivered over the Internet, or traditional classroom courses held at a technical school.

Online courses
Virtual classroom courses use special client software or a web browser plug-in to simultaneously connect several students to an online classroom, which is managed by a live instructor. Virtual classroom courses are a good option for students who live a great distance from a technical school, or who have any conditions that make it difficult for them to travel to a physical classroom. These classes take place in real-time, so candidates must be able to work them into their existing schedules.
Traditional classroom

Finally, there are traditional classroom courses. For some, this training option offers the best learning experience: a live instructor, other students to collaborate with, and (at most schools) access to all of the relevant hardware and software labs necessary to master Server+ course content.

Here are the most common subjects a Server+ student can expect to encounter, no matter which training option they select:

Identifying and configuring server hardware components
Installing and configuring a network operating system
Server security fundamentals
Server-based storage technologies
Disaster recovery and contingency planning
Server troubleshooting tools and techniques

Server+ certification exam
There are no prerequisites for taking the Server+ exam, although CompTIA recommends that candidates have their A+ certification and 18 to 24 months of experience working with server hardware and software. The Server+ exam can be booked and taken at any authorized CompTIA exam center. As of this writing, the current Server+ exam code is SK0-003. The exam is available in English, Chinese, German and Japanese.

The Server+ exam is made up of 100 multiple-choice questions. Candidates have 90 minutes to complete the exam. The passing score for the exam is 750 on a scale of 100-900, and candidates are informed immediately upon exam completion if they have passed or not.

Here's a list of the Server+ exam knowledge domains, with an estimate of how much exam content is dedicated to each:

System Hardware (21%)
Software (19%)
Storage (14%)
IT Environment (11%)
Disaster Recovery (11%)
Troubleshooting (24%)

Server+ in the workplace
The Server+ cert is valid for three years once it has been awarded by CompTIA. Candidates can renew the Server+ by earning a set total of CompTIA Continuing Education Units (CEUs) during the three-year certified period. CompTIA CEUs are attained by earning additional CompTIA certs, or can be gained by participating in certain approved industry activities. For more information about the CompTIA Continuing Education Program, visit the CompTIA Certification website.

If the Server+ is allowed to expire, the exam must be passed again in order to re-certify.

Some of the job roles associated with the Server+ certification include the following:
Authorized Server Technician
Server Sales Specialist
Network Server Support Specialist
Application Server Specialist
Web Server Specialist


Tuesday, September 1, 2015

VMware rounds out data center virtualization stack

VMware has added more components to its software-defined data center, updating vCloud, NSX and its OpenStack distribution

VMware has updated its stack of data center virtualization software, rounding out capabilities that allow an organization to run an entire data center operation and related cloud services as a single unified entity.

Among the new additions are components to the vCloud Air suite of software for running cloud services. The company has expanded its network virtualization software to provide more ways of moving a workload across a data center. And it has also released a new version of its OpenStack distribution for running cloud workloads.

VMware's vCloud Air is the company's answer to the success of cloud service providers such as Amazon Web Services. The software lets organizations run their own IT operations as a set of cloud services. It also provides a unified base for multiple cloud service providers to offer vCloud services that interoperate with each other as well as with customers' internal vCloud deployments.

VMware vCloud Air now has a number of new options for storing data, such as vCloud Air Object Storage for unstructured data. It features built-in redundancy, eliminating the need to make separate backups, and the data can be accessed from anywhere in the world.

The company also has a new database-as-a-service, called vCloud Air SQL, which provides the ability to store relational data on a pay-as-you-go model. Initially, vCloud Air SQL will be compatible with Microsoft SQL Server, but plans are to make it compatible with other relational databases.

The company has updated its VMware vCloud Air Disaster Recovery Services, which provide a way to ensure that operations continue even if the enterprise's data center goes offline. It now has a new management console for testing, executing and orchestrating disaster recovery plans.

VMware also updated its software for virtualizing network operations. VMware NSX 6.2 allows a virtual machine to be copied across a single data center, or even two different data centers, while retaining its networking and security settings.

NSX 6.2 now can recognize switches through the Open vSwitch Database (OVSDB) protocol, providing new ways for the users of such switches to segment their physical servers into smaller working groups. VMware NSX 6.2 also has a new central command line interface and a set of troubleshooting capabilities, called TraceFlow.

VMware says NSX is now being used by more than 700 customers, with more than 100 of those deployments running in production.

VMware vRealize Operations, which provides a single interface to watch the operational health of applications running on VMware, has been updated to include capabilities to find the best set of resources within a data center to place a workload. It also does rebalancing to move workloads around for most efficient use of data center resources.

Also on the management side, the company has updated its logging software, which is now capable of ingesting 15,000 messages per second. The software also now offers new ways to chart and search through operational data.

The newly released VMware Integrated OpenStack 2 is based on the latest release of the open source OpenStack software, which was codenamed Kilo and released in April. The new release has a load-balancing feature as well as the ability to automatically scale up workloads should they require more resources.