Optimize Your WordPress UX: Specific Steps

Posted by & filed under List Posts.

In the first part of this two-part series, we looked at why user experience (UX) is so important, along with general steps you can take to improve it within your WordPress environment, such as refining design and functionality. This second part examines more specific steps you can take to improve UX – such as fixing 404 errors and installing various UX-oriented WP plugins. It closes with a few common UX mistakes to avoid.

Fix any issues with 404 errors.

One very simple step is to watch out for any dead ends – 404 errors – as indicated by WordPress theme firm WPion. If a user hits a 404 error, there is little chance you will retain them as a visitor. After all, there are probably no images or content for them to see (unless you have an incredible 404 page). You want every page to function as intended, because your UX will suffer if people land on these dead ends.
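If you want to hunt these dead ends down systematically, a dead-link scan is easy to sketch. Below is a minimal illustration using only the Python standard library; the sample HTML is a placeholder, and a real crawl of your own site would also need politeness controls such as rate limiting:

```python
# Minimal dead-link scan sketch, standard library only.
from html.parser import HTMLParser
from urllib.request import urlopen
from urllib.error import HTTPError

class LinkCollector(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Returns all anchor hrefs found in an HTML string."""
    collector = LinkCollector()
    collector.feed(html)
    return collector.links

def is_dead(url):
    """Returns True when fetching the URL yields a 404."""
    try:
        with urlopen(url) as response:
            return response.status == 404
    except HTTPError as err:
        return err.code == 404

sample = '<p><a href="/about">About</a> <a href="/old-post">Old</a></p>'
links = extract_links(sample)   # ['/about', '/old-post']
```

Feeding each page of your site through `extract_links` and testing every collected URL with `is_dead` gives you a list of 404s to fix or redirect.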

Use best practices for carousels.

One way to improve the UX of the site is with a carousel or slider, which gets your message above the fold (i.e., in the part of the page that is visible before scrolling) and in front of the potential customer.

The reason that many people like to use a carousel on their WP site is that it:

  • lets you show off your most important messaging in a central location;
  • allows you to prioritize as you place the messages in your preferred order; and
  • allows you to be diverse in your messaging and have a bit of appeal for everyone.

While those reasons are compelling, many people do not believe in the UX of this approach for two reasons, noted WP theme site ThimPress:

  • users often do not have the patience to wait for the slides to go through, perhaps looking at one image and moving on; and
  • carousels are not search-engine friendly, because it is not easy for search spiders to crawl them.

For the best possible UX if you do implement carousels, use these tips:

  • Turn off any auto-forwarding function;
  • Allow the user to control slider navigation; and
  • Limit your slides to a maximum of 5.

Strike a balance with white space.

Sufficient white space makes it easier to digest your content. These empty areas allow you to demarcate the different places where you want the eye to go. Substantial white space is especially important when you are trying to provide a large amount of information above the fold. Creating the right balance is a challenging aspect of web design, but the general idea is that a larger amount of content calls for a correspondingly larger amount of white space.

Upgrade your navigation’s sophistication with mega menus.

Tools or functionality within WordPress geared toward highly advanced menu creation are loosely called mega menus. Mega menus are a logical next step beyond the primary navigation in the header and secondary navigation in the sidebar – and they make sense in the context of sizable blogs or many ecommerce product pages.

The reason mega menus are increasingly popular is that there is uniform visibility, with everything included, even content that is on lower tiers. Since everything can be seen right away, the UX is improved because the user is able to get to your less frequently visited materials more quickly.

On the downside, said ThimPress, mega menus will:

  • sometimes slow down the site, particularly when you use icons or graphics;
  • be useless for a site that does not apply the WP taxonomy well to organize its content hierarchically; and
  • be difficult to make 100% responsive for the best mobile performance.

Beware of modal pop-ups.

You of course want to convert as many of the visitors to your site as possible – but be careful not to drive them away by being too aggressive.

Interruption marketing, otherwise known as the modal pop-up, can be a great way to boost sales. Many, many plugins now make it possible to interrupt the user in this manner. You can let them know about a discount, have them sign up or subscribe, remind them to like a social media post, etc.

This method is far from perfect. The reason people use it and consider it a UX improvement is simply that it raises their conversions.

Reasons that a modal pop-up can be the worst thing you can do for user experience are that:

  • the popup can get in the way when your visitor is interested in digesting content, such as an explanatory article;
  • the popup will typically feel like it is simply a distraction and will often be ignored unless it harmonizes with the journey of the user; and
  • accessibility is inevitably damaged by these devices.

Select plugins to improve your usability.

Perhaps it is a no-brainer that you can solve many problems on WordPress with plugins. While these add-ons may be immediately available, wading through them and figuring out what you need – and what is well-maintained – can be a huge undertaking. Neil Patel is a thought leader both in marketing and in user experience, so his suggestions on this topic are worth noting. Here are the 11 plugins that Patel suggested for UX:

  1. Contact Form 7 – KoMarketing released a poll in 2015 that revealed nearly half of respondents (44%) had left a site because there was no contact information available. This plugin simplifies the creation of a contact form.
  2. What Would Seth Godin Do – Using cookies, this plugin displays different welcome messages to returning visitors than you display to new ones.
  3. Google Analytics Dashboard for WP – This plugin gives you a quick overview of your main stats right in your WP admin portal.
  4. W3 Total Cache – This plugin allows you to keep the user’s browser from having to re-download redundant data.
  5. WP Smush.it – You want to compress your images for strong performance. This plugin makes that easy.
  6. WPtouch Mobile Plugin – Making it simple to become mobile-friendly, this plugin addresses an increasing concern.
  7. Broken Link Checker – The average lifespan of a webpage is 100 days, said Patel. Broken links should be addressed. This plugin notifies you of them over time.
  8. CrazyEgg Heatmap Tracking – This plugin shows you how users interact with your site through visual heat-mapping. (Notably, this plugin is from Patel’s company.)
  9. Better Click to Tweet – This plugin allows you to tweet little quotes from within content.
  10. WP Live Chat Support – This plugin offers you the option for free live chat.
  11. P3 Profiler – This plugin gives you a report on the load times and similar information for all your other plugins.
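On the caching point behind a plugin like W3 Total Cache (#4 above): the basic HTTP mechanism that spares the browser a re-download can be sketched in a few lines. This is only an illustration of conditional GETs with ETag validators, not the plugin's own code:

```python
# Conditional GET in miniature: a matching ETag lets the server answer
# 304 Not Modified with no body, so nothing is re-downloaded.
import hashlib

def make_etag(body):
    """Derive a cache validator from the response body."""
    return '"%s"' % hashlib.md5(body).hexdigest()

def respond(body, if_none_match=None):
    """Returns (status, payload) for a GET with an optional If-None-Match."""
    if if_none_match == make_etag(body):
        return 304, b""     # browser reuses its cached copy
    return 200, body        # full download

page = b"<html>hello</html>"
first_visit = respond(page)                # status 200, full page sent
revisit = respond(page, make_etag(page))   # status 304, empty body
```

On the revisit, the server sends only headers; the browser serves the page from its local cache, which is exactly the redundant-download saving the plugin provides.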

Avoid common usability mistakes.

Here are some frequently made mistakes that would cause your UX to suffer:

  • Choosing diversity over consistency – Be consistent. Stick with your color scheme and fonts.
  • Sacrificing UX and only focusing on speed – You want your site to be fast, both through your own optimization and with high-performance infrastructure. However, speed must be paired with function – so assess your UX in conjunction with speed for optimal engagement.
  • Not paying attention to industry standards – Be certain that what you are doing on WordPress applies to the niche you are serving. Strong UX is about moving away from a one-size-fits-all approach.
  • Placing emphasis in the wrong directions – Research cited by WP Mayor (also the source of these common mistakes) demonstrates the order in which the typical user scans a page and, in turn, the priority of site elements. The eye goes to elements on the page as follows: logo, navigation menu, search box, social media links, main image, text content, and bottom content. Use this knowledge to guide design in a conscientiously prioritized manner.
  • Trying to meet numerous objectives simultaneously – Meet that one goal that the user has for your site, rather than trying to be all things for all people.

Strong performance for strong UX

Going beyond performance to broadly consider functional usability of your site is key. However, as indicated by that last plugin from Patel and from other elements above, the speed and reliability of your WordPress environment will be key to delivering strong user experience. At Total Server Solutions, our infrastructure enables our cloud to realize performance that other providers can only dream about. See our high-performance infrastructure.

Improving Your WordPress UX


There would seem to be two sides to user experience – that of the user doing the experiencing, and that of the service provider attempting to create a certain experience or reaction in the user. It is a little more complex than that, though, because multiple parties share in crafting a strong user experience. Since UX is so fundamental to a site's success (see below), it is the concern of every business – so companies that leverage WordPress as part of their IT ecosystem want to optimize the content management system (CMS) for greater UX, and WordPress itself wants the user experience of its system to be strong. In this sense there are layers of UX, with providers at two levels – the CMS developer and the individual website – each boosting this factor.

Beyond the different players, there is another way to categorize UX: in terms of general ways that you can improve the site and specific features that are available within the platform. This two-part series looks at some of the general ways to approach UX and specific tools that you can use to do so, tapping the perspectives both of the WordPress community and WordPress itself.

Before reviewing steps you can take, broadly and specifically, to improve your site’s UX, it helps to first look at why it is so important.

Who cares? Is UX really important?

Usability, or user experience, is a core element of IT that is often discussed but not necessarily well understood. It stays central because the experience of the user is essential to a site's outcomes. Strong UX will improve those outcomes in the following ways, according to the Usability Professionals Association (UPA):

  • More sales
  • Better productivity
  • Lower expense for support and training
  • Lower expense and time for development
  • Lower expense for maintenance
  • Stronger customer satisfaction.

Statistics from industry research are also telling. The findings of a 2016 Forrester Research report reveal how powerful user experience is. The analysis found that by improving the UX of the user interface, conversion could rise as much as 200%; even more striking, by improving the overall UX of the website, conversion could go up as much as 400%.
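To make those percentages concrete, here is the arithmetic applied to a hypothetical 2% baseline (the baseline is our illustration, not a figure from the Forrester report):

```python
# Applying the Forrester percentages to a hypothetical 2% baseline.
def lift(rate, pct_increase):
    """Applies a percentage increase to a conversion rate."""
    return rate * (1 + pct_increase / 100)

baseline = 0.02                    # illustrative 2% conversion rate
ui_gain = lift(baseline, 200)      # UI-level UX work: up to 6%
site_gain = lift(baseline, 400)    # site-wide UX work: up to 10%
```

In other words, a 200% rise triples the rate and a 400% rise quintuples it, which is why UX investment can pay for itself quickly.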

4 general ways to improve UX

Here are four ways you can improve your user experience:

1.) Work on design.

You want the design of your site to be uncluttered and easy. It is a common issue for web designers to get excessively focused on aesthetics and forget about how the site will be used. Setting aside the needs of the user to try to impress them with appearance is a huge mistake, as Tom Ewer explains in WPExplorer, noting that “a visitor almost always prefers function over form.” In that sense, it is important to always be thinking about what might be more obvious for the user so that they do not get confused.

Great web design may be interesting to people. However, since websites are (in almost all cases) fundamentally tools that serve a function, predictability is positive for that core need. The person using the site wants to be able to quickly understand a site, in part by seeing capabilities in certain locations, so they can do what they want to do.

Since predictability is so important, good design for UX should follow established standards. Ewer points out that there are various ways you can tweak a standard WP blog design to make it more appealing.

When you do go with a simple design for better UX, beautify it with compelling typography and strong colors. You want the look to be captivating and intriguing yet to remain easy and straightforward. Your function and form should be balanced – but again, with most of the weight on function.

At the same time, while you do not want the design to get in the way of people being able to use your site effectively, you still want it to be appealing. Making the design appealing as well as clear will lead to better UX.

2.) Focus on functionality.

As indicated above, you want the layout of the site to be geared toward fulfilling the needs of the user. To be clear, we are not talking about performance, which can be improved with speed optimization tactics and by using high-performance infrastructure. When you think about your functionality, think about what your website is able to provide to visitors. They need to be able to get around your site and do what they need to do.

As such, navigation is core to functionality.

The user should be able to simply make their way through your site, using simple navigation components. Those elements of your site could be:

  • a sidebar search box
  • a sidebar categories list
  • an archives page
  • breadcrumbs
  • a navigation bar.

Almost all sites will have that navigation bar if not the other elements. The archives page is especially helpful if you organize it by tag, category, and date.

3.) Make your content sing.

You want your content to be, above all, relevant to the users of your site. Relevance is the key point related to UX because the user does not want to have to wade through information that is nothing but a distraction.

Content is not just about the information, though, and that is true even when it comes to UX: you also want it to look great.

To create better user engagement, your approach should take two forks: formatting and typography.

When you look at format, here are aspects to consider:

  • You want your sentences and paragraphs to be as short as possible so that everything is easily digestible.
  • Use to-the-point, ample subheadings.
  • Important aspects of the article or important terms should be bolded.
  • Incorporate italics for emphasis of specific terms.
  • Introduce images so you are not just reliant on text.
  • Use blockquotes, tables, or lists to enhance readability and create pauses in the reading.

To cover the second fork, typography, think particularly in terms of your choice of font, along with its color and size.

4.) Use your homepage as a starting point.

Think of your homepage as the entrance to your business, noted Madan Pariyar of Theme-Vision.

The homepage of your site is usually visited more than any other page, and it will attract more links than any other. Visitors who land on another page first will also typically go to the homepage next.

Since the homepage is such a doorway to your site, you can greatly improve your UX by focusing on how the user interacts on that page.

Ewer recommends two components to improve homepage UX:

  • Feature box – This site element can go at the top of your website and provides an immediate sense of what your site provides. This box can be an opt-in form if you want. A feature box clarifies what you do.
  • Start Here page – A Start Here page gives an idea of what is good about your site and provides a step-by-step quick guide to interacting with the site. It’s hand-holding, basically; and that is often what people want.

Note: The second part of this series will be linked here soon.

High-performance WordPress hosting

Do you want great UX for your WordPress site? Beyond what you can do on your own, you will also need strong infrastructure to power speed and reliability so your users experience incredible performance. At Total Server Solutions, our passion is enabling you to say what you need to say and keep your message in front of your audience. See our philosophy.

Web Hosting Continues to Grow, Especially Cloud [Market Research]


Web hosting is growing at a fast rate. However, looking specifically at cloud hosting shows us the segment that is really fueling the current and ongoing growth of the industry.

Hosting market overview, e-commerce example & types

The web hosting industry supplies solutions that enable individuals and companies to run their websites, applications, and other IT systems without having to build, staff, or maintain an internal data center. A web hosting services provider, commonly shortened to web host or simply host, provides both the hardware (the physical back-end equipment that supplies IT resources such as processing power) and the services that allow your website to be available online. Web hosting is a way for you to keep your site up for public visibility, as well as a place for storage and access of your data that is protected by security specialists. Another key element of web hosting is a development and testing environment. There are numerous other functions provided by a data center, extending beyond IT to broader areas such as disaster recovery, business continuity, and compliance.

The digital assets of the organization that pays for a web hosting service are stored on servers. Servers can be physical (actual physical computers) or virtual (digital constructs that are either portions of single physical machines or portions of groups of physical machines); either way, the hardware is the basis of hosting.

The growth of e-commerce is in large part due to the availability of web hosting services. Owners of online businesses are able to have someone else handle the infrastructural and maintenance aspects of their IT systems, keeping their site and databases safe, while focusing on their core business.

E-commerce businesses can add shopping carts and other features to their web presence, such as blogs, forums and chat environments. Additional features are added to make the business more robust in its visibility and to answer customer questions through means beyond direct customer service. They allow companies to improve their customer relationships and gather analytics on user behavior to provide better value over time.

Typically the web hosting services market is divided into segments by type – or the specific kind of hosting provided. Types of hosting include dedicated hosting, VPS hosting, shared hosting, cloud hosting, colocation, and website builders.

Dedicated hosting (whether managed or unmanaged) means that physical servers are being set aside for the use of one organization. A dedicated server, within the context of any organization, means that a single machine is being used for one dedicated purpose; the same applies with dedicated hosting.

The alternative to a dedicated server internally has often been a virtual private server (VPS). A VPS is a form of virtualization that is still at the level of an individual machine. VPS hosting is still available and can be set up as desired by a full-service infrastructure company. However, for the most part, this model has been replaced by cloud – since cloud allows better performance and efficiency.

Cloud hosting is hosting through a virtual server that is constructed out of resources from numerous physical machines, allowing for optimal resiliency, speed, and redundancy.

Shared hosting is the low-grade, low-budget form of hosting, often compared to living in an apartment building. There are known, major security problems with it, and the concerns go beyond that analogy, with InfoSec Institute noting, “[I]f your site is hosted on a shared hosting server, it is only as secure as the site with the weakest security on the same server.” Plus, all accounts within a server used for shared hosting share the same resources – disk space, memory, and CPU – so you may need resources just when they aren’t available. In contrast, with a strong cloud plan, you will have a guaranteed level of performance. Security of cloud is also often praised (see Quentin Hardy and David Linthicum).

Colocation is the renting of data center space for hardware that is already owned by an organization.

Website builders are also included within the hosting category since those are sometimes used by hosting services, in conjunction with infrastructure.

Growth projections for web hosting

Web hosting is a relatively mature industry, having existed on a mass scale for just over 20 years. However, it is expected to grow at an incredible rate, according to a couple of recently released market analyses.

According to a March 2018 forecast from Market Research Future (MRF), the web hosting market may already be large but is expected to continue to expand, achieving about $154 billion in revenue by 2022. To get to that level, it will sustain a compound annual growth rate (CAGR) of 16% between 2016 and 2022.
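As a sanity check on that forecast, compound annual growth is just repeated multiplication. Working backward from MRF's endpoints implies a 2016 market of roughly $63 billion; that back-calculation is ours, not a figure MRF published:

```python
# Compound annual growth: end = start * (1 + rate) ** years.
def project(start, rate, years):
    """Projects a value forward at a constant compound annual rate."""
    return start * (1 + rate) ** years

# Working backward from ~$154B in 2022 at a 16% CAGR over 2016-2022:
implied_2016 = 154e9 / (1.16 ** 6)            # about $63.2 billion
roundtrip = project(implied_2016, 0.16, 6)    # recovers ~$154 billion
```

Six years of 16% growth multiplies the starting value by about 2.44, which is what makes a mature industry more than double in revenue over the forecast window.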

A report from California-based market research firm QYReports was nearly identical to the projection from MRF. The QYReports analysis suggested that web hosting will grow at a 16% CAGR through 2023. This forecast could be seen as slightly more optimistic than the other forecast, since 16% is an impressive growth rate to be sustained for an additional year.

This report, released in February 2018, was discussed on tech news site WhaTech. WhaTech noted that the industry would also become more competitive during this period, since growth projections continued to be so strong.

Gartner clocks cloud hosting at 36.6%

As indicated in the market overview above, cloud hosting is on the rise – so understanding the above numbers is in part about understanding what is going on with the cloud market. Here is what is really impressive: cloud hosting, infrastructure as a service (IaaS), is growing at an incredible rate – 36.6% — from its size of about $34.7 billion in 2017.

These numbers come from Gartner, which estimated that sales of cloud hosting, or infrastructure as a service (IaaS), would grow at a 23.31% CAGR, far outpacing the general growth of cloud services. Notably, cloud services was not expected to be as wildly successful as a broad category – although Gartner did expect healthy growth of 13.38% through 2020.

A primary reason for broader adoption of cloud services – and the stronger CAGRs that come with it – is a transition in the purchasing of business process, application, middleware, and infrastructure to the cloud model, per Gartner.

Data from a forecast by Synergy Research Group, released in July 2017, had similar findings. That report suggested that public infrastructure as a service (IaaS) and platform as a service (PaaS) would grow at 29% annually. Synergy suggested that managed or hosted private cloud would expand at 26% over the same period. Another interesting insight from the Synergy report was that the market for software and hardware to supply enterprise data centers would keep shrinking as public cloud was continually adopted in place of privately controlled IT infrastructure.

Cloud hosting for your growing business

Web hosting has been growing. Businesses have increasingly seen the need to expand their online presence, and they have increasingly entrusted outside systems and specialists to further that growth. Cloud hosting is one area that is growing at an incredibly fast rate. The fact is, all cloud systems are not built alike. At Total Server Solutions, we do it right: our cloud hosting boasts the highest levels of performance in the industry. Your cloud starts here.

IoT Will Propel Hosting Industry


We have written recently on the cloud-IoT connection and on how cloud hosting is central to the current growth of the hosting industry. Given those two precursors, it is only natural to write this third piece on the direct impact of the IoT on the web hosting market.

Boom of IoT fueling hosting industry: long time coming

Some may experience déjà vu, feeling that they have read this same article maybe 5 or 6 years ago. After all, the notion of a surge of connected devices has been around for some time, and researchers who saw IoT as a source of growth for data center and infrastructure services have been proven right. An example of that is Rachel Chalmers, an analyst with 451 Research; she said in 2012 that the IoT would revolutionize the data center and hosting sector, creating a deluge of data that would cause organizations to work with outside companies to assist them in management.

Chalmers talked about how the potential of the data center industry was growing in part because it had remained so strong during the Great Recession, continuing to grow during that period when so many other industries receded. The worldwide credit crisis and economic conditions that some said teetered on a worldwide depression were not felt within the infrastructure industry, which continued with double-digit growth over those years.

Chalmers specifically pointed to the Internet of Things as the main centerpiece of the future, defining the emergence of the IoT as an “inflection point” that would create much greater growth for hosting as companies began to reorganize the way they approach information. “We believe hosting and managed services providers stand to be the main beneficiaries of this trend,” she said.

Reports show investment in IoT networks & infrastructure

Recent numbers suggest that Chalmers was right.

An IDC forecast from December 2017 suggested that IoT spending would rise nearly 15%, from $674 billion in 2017 to $772.5 billion in 2018.

The same analysis noted that the sector would hit $1.1 trillion by 2021, growing at a CAGR of 14% from 2017 through that point.

A January 2018 report from Massachusetts-based analyst BCC Research specifically looked at the expansion of the market for IoT networking services.

The report said that the IoT is expanding globally because of the following factors:

  • evolution of cloud computing and storage systems
  • dropping cost of embedded computers and sensors
  • proliferation of wearables
  • broader wireless and web access.

The researchers noted that there are currently more connected devices than people on the planet, and that there would be four devices per person worldwide by 2022.

The revenue from sale of IoT networking plans will reach $1.0 trillion by 2022, this report projected, rising at a CAGR of 21.6% over that period.

Another compelling report is from Chalmers’ firm, 451 Research. The 2017 analysis gathered responses from 575 worldwide IT executives and senior IT buyers who had positions largely in North America and Europe.

The results showed that large numbers were planning on investing in various backend elements in order to facilitate the burgeoning Internet of Things over the next year:

  • storage capabilities – 32.4%
  • network edge hardware – 29.4%
  • server infrastructure – 27.2%
  • external cloud infrastructure – 27.2%.

The report noted that companies were increasingly adopting cloud storage because it allowed them to keep costs low while staying agile.

IoT impact: rise of cloud hosting

The cloud is predicted to quadruple in the next few years, said Chris Pentago in TechCo in 2017. Pentago pointed out another startling fact: the portion of workloads processed internally would dwindle to just 8% of the total, with cloud hosts and data centers processing the other 92%.

Pentago pointed to the Internet of Things and big data as fueling this growth, since the volume of data being produced by the IoT is so massive.

IoT impact: “the new oil”

The Internet of Things is about interconnection of all the things around us, giving cloud processing and insight to our environments. We will increasingly be able to talk with and better manage our homes, offices, and cars.

Again, the scope of data must be huge in order to enable our “smart” surroundings.

Mark Bidinger of Schneider Electric said that the IoT is fundamentally about growth of data and growth of data centers.

As the Internet of Things heralds the arrival of an even more connected environment, said Bidinger, businesses realize that their networks should be interoperable, secure, transparent, and flexible. This concern with networks is in large part because of the value of what they are carrying – the data that Bidinger calls the “new oil,” a resource that large analytics platforms consume.

An example is a metals, minerals, and mining business. Every minute, the enterprise generates 2.4 terabytes of data through its mines, railways, and ports. The data is produced to feed the firm’s preventive maintenance and predictive analytics platforms. The ultimate question is always whether or not investing in data analysis is worth it. In the case study of this firm, “[t]he return on that data investment is about $200 million per year over the next 3 years.”
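To appreciate the scale, extending that 2.4 TB per minute by straight multiplication (our arithmetic, not the case study's) gives:

```python
# 2.4 TB per minute, scaled up by straight multiplication.
TB_PER_MINUTE = 2.4
per_day = TB_PER_MINUTE * 60 * 24   # 3456 TB, roughly 3.4 petabytes a day
per_year = per_day * 365            # 1,261,440 TB, roughly 1.26 exabytes
```

A single enterprise producing petabytes of sensor data per day is exactly the kind of load that pushes processing out of internal data centers and into hosted infrastructure.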

The extent to which IoT technology could yield ROI and help companies outperform their competition is almost unbelievable. However, edge data centers are needed for data processing. Colocation providers and web hosts will have to provide strong and flexible service so that the IoT can be easily integrated and demand can be met.

While huge amounts of data may be needed as “food” for the analytics programs, those systems will pay off by developing insights that lead to lower expenses, streamlined efficiency, better reliability, and greater speed.

IoT impact: rise in data lakes

People will be deploying more data lakes in the era of the IoT. Enterprises are increasingly using these repositories of raw data stored in its native format.

This trend was reported by IT executive and engineer Dean Hamilton, who noted in 2017 that data lakes were becoming commonplace with the embrace of the Internet of Things.

He is backed up by the numbers. A study by MarketsandMarkets found that the data lake market would grow at a 28.3% compound annual growth rate (CAGR) from 2015 to 2021, rising from $2.53 billion to $8.81 billion (in USD) over that stretch.

As an increasing amount of data is produced, and as more cars, appliances, and sensors start to interact seamlessly in an IoT world, it is becoming generally accepted that data should be collected at central points, where big data engines can be used to compare it to other data and otherwise assess it. Having these data lakes in place is important in order for companies to get the most value out of the data, in immediately applicable and long-term knowledge.

To be clear, a data lake is a management and storage platform for huge quantities of structured and unstructured data, all in its native format. It also allows data to be retrieved more quickly as needed.

Your IoT cloud

Are you in need of cloud for your IoT project or data lake? At Total Server Solutions, using the fastest hardware, coupled with a far-reaching network, our cloud hosting boasts the highest levels of performance in the industry. See our cloud solutions.

The Connection Between Cloud Hosting & the IoT


The Internet of Things is beginning to change our lives in numerous ways. Certainly elements of greater efficiency and convenience will be introduced by the IoT; however, there is a price to pay.

With the IoT generating a massive volume of data, the infrastructure must somehow match pace. Companies need to have the right hardware in place, with reliable data centers to house it, if they want to alleviate their data growth pain. Cloud is helpful in terms of allowing IoT devices to work in coordination.

Scope of the IoT

Cloud hosting is needed in part due to the data demands of a fast-growing IoT. According to a market forecast from the International Data Corporation (IDC), global revenue for the IoT is predicted to rise from $674 billion in 2017 to $772.5 billion in 2018, a nearly 15% increase; the firm expects IoT spending to rise at a 14% compound annual growth rate (CAGR) from 2017 through 2021, at which point it will have achieved a $1.1 trillion scope.

 

All this growth is an indication of an increasing number of connected devices. A Massachusetts Institute of Technology (MIT) study predicted that there would be 28 billion devices connected to the Internet of Things by 2020. Gartner researchers suggest the number will be a bit higher by that point, at 33 billion.

 

Relationship between cloud & IoT

 

The majority of new electronic devices have cloud-hosted systems that help them to function. The cloud elements are becoming more integral both to the technology and to its ecosystem. Cloud is expected.

 

Device users expect what they buy to be connected to the Internet. People increasingly want all aspects of their surroundings to be interconnected – and that integration is enabled by cloud servers. Manufacturers of these devices “understood early on that it does not make sense to keep all the smarts and storage in the device itself,” explained David Linthicum, who added that companies also wanted ways to seamlessly and immediately apply system-wide updates.

 

The same basic model – IoT devices run on cloud servers – is used for your smartphone updates and TV service, and it’s increasingly how your thermostat and car will run as well.

 

There are challenges that arise from this more connected IoT era we are entering. Security is the most obvious. While it may not be so troubling a thought to have your TV hacked, it is not something you want to happen to your vehicle. Security has often not been sufficiently prioritized by IoT device manufacturers, and systems in which data protection has not been at the forefront are bound to suffer numerous security events in the field. (For instance, manufacturers have not always demanded that their data be stored in facilities that meet the American Institute of CPAs’ SSAE 16 / 18 service controls standard.)

 

We are now experiencing a huge expansion in cloud services for devices, along with greater use of the compute and storage that run them (i.e., cloud hosting, or infrastructure as a service); we will also get stronger networks, such as cellular networks that are nearly as fast as home networks.

 

To get a sense of how the Internet of Things is growing, look no further than your own home Wi-Fi and its number of connected devices; there are numerous applications in business that make the same case for its expansion. The rise of the IoT cannot occur without cloud services to back it. Plus, since connected devices are becoming so much more widely used both in personal and business settings, the use of cloud systems will only further proliferate.

 

What’s the difference between cloud and the IoT?

 

Cloud computing is the technology that allows for virtual distributed computing – delivery of reliable and scalable resources unconfined by specific hardware (rather, supported by a technological structure that incorporates numerous computers). There are three models of cloud: software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS). In SaaS, applications are run in an off-premises cloud system; in PaaS, the cloud contains the tools and building blocks to create cloud apps; and in IaaS, the provider supplies data center space, networking, storage, and servers – the infrastructure for a project – with full maintenance built into the relationship.

 

Those models of cloud have to do with the IT service that is being provided. Cloud is also separated into three types, which are ways of describing how isolated the environment is to one entity’s use — public, private, and hybrid. In a public cloud, software, storage, or infrastructure are provided by an independent party through the Internet. This type is the standard version of cloud. Private cloud is the same notion as an on-premises system (in which all equipment is for an individual company’s use) but with automation and virtualization that allow the environment to have cloud characteristics. Private clouds can be hosted in an on-premises data center or by a third party. Hybrid cloud is an integrated blend of private and public clouds.

 

In contrast to cloud, which is a technology with all those broad applications, the IoT concerns the Internet connection of everyday objects (beyond standard devices such as personal computers and smartphones). Examples of IoT devices are refrigerators, thermostats, automobiles, and heart monitors. Additional devices are joining the IoT all the time as the sector continues to boom.

 

While the scope and focus of the IoT and cloud are certainly distinct, they have roles to play that are intertwined in a world of ever-growing data.

 

Why cloud is essential to powering the Internet of Things

 

Cloud and the IoT have a complementary relationship that enhances efficiency, noted Andrew Meola in Business Insider. “The IoT generates massive amounts of data,” he explained, “and cloud computing provides a pathway for that data to travel to its destination.”

 

Here are some of the benefits of cloud computing from Information Age:

 

  • There are fewer operational challenges with cloud. While cloud may seem to have complexity that is sure to give rise to problems, it is in fact simpler to handle than other infrastructural approaches. The cloud runs on its own hardware, managed by a service whose sole concern is making the cloud work correctly and without snags – typically yielding better reliability than an on-site server.
  • It’s less expensive to run long-term. You are able to cut your costs. You don’t need IT personnel to maintain your infrastructure, for instance.
  • It’s easier to get running short-term. It is difficult for a startup to work with the small amount of money at its disposal. Servers can be expensive, so it helps to get them instead as-a-service.
  • Cloud is more sustainable. Using cloud architecture means that you do not need as much physical hardware, since there is less underutilization of resources.
  • The security is stronger with cloud. You have better data protection with cloud than you do in a local environment. You are not going to lose your information because of a component failure or because of an extreme weather event.

 

Your IoT cloud

 

Cloud computing and the Internet of Things will work together to change the way that we handle huge volumes of data in the years ahead. Any IoT project requires strong cloud hosting so that it has the infrastructure that delivers all the next-gen promises of reliability, availability, and scalability. At Total Server Solutions, we offer the fastest, most robust cloud platform in the industry. Build your IoT cloud now.

Why SSAE Compliant Hosting

Posted by & filed under List Posts.

Accounting and information technology (IT) are very much connected; and the creation of financial statements must be founded on the principles of reliability and accuracy. After all, what good are numbers that do not reflect the real situation? Furthermore, when money is involved, the potential for fraud will naturally be high.

 

While it may seem to an outside industry that accounting is as simple as adding up the numbers, “garbage” data can easily have a presence on financial statements, according to Michael Sack Elmaleh, CPA, CVA (certified public accountant / certified valuation analyst).

 

The reason that financial records can often be poor is that it is not easy to see inaccuracies just by looking at reports. It could be that nothing seems problematic about the figures but that some of them are simply false.

 

The two primary reasons that inaccuracies arise, notes Elmaleh, are intentional deception and a lack of proper accounting skills. Both of these issues can be addressed in the same manner, via the following two methods:

 

  • Partner with an outside accountant to perform regular audits of financial reports. The third-party accountant can check to make sure that the numbers are aligned with generally accepted accounting principles (GAAP) throughout the statements.
  • Establish sufficient controls within the organization. The controls are policies and procedures that are used to protect against fraud; make certain that statements are accurate; and properly protect all data and systems.

 

Both of these practices should be in place at organizations. Audits should be conducted regularly, and strong controls should be in place and monitored for their consistent application.

 

The Statement on Standards for Attestation Engagements becomes immediately relevant in the context of the above two practices. SSAE 16 (which has now been recodified as SSAE 18) was actually titled “Reporting on Controls at a Service Organization,” while SSAE 18, issued in April 2016, is titled “Attestation Standards: Clarification and Recodification.”

 

It is essentially a set of instructions or standards to be used by auditors when they create reports on internal controls relevant to the creation of financial statements. As such, the SSAE 16 or SSAE 18 process both brings in an outside entity to verify that appropriate controls are in place (the audit, #1 above) and recommends any controls that are not present but should be (application of controls, #2 above).

 

What is the AICPA?

 

To understand the Statement on Standards for Attestation Engagements is to understand the American Institute of CPAs (AICPA), the organization that created and develops the standard.

 

The AICPA was founded in 1887. It is the largest professional association for accountants, with more than 418,000 members across 143 nations.

 

The group develops ethical guidelines for CPAs, as well as auditing protocols to be followed by public agencies, private firms, and nonprofits. The association develops, maintains, and scores the Uniform CPA Examination, which must be passed in order to become a CPA.

 

SSAE compliance as critical to IT

 

A prominent accountant and compliance specialist, Chris Schellman of BrightLine (now Schellman), wrote a great piece a few years ago on SSAE 16 and why it is a specifically important standard for data centers.

 

Schellman explained that the standard simply should be in place in order for a facility to be treating its customers’ data with respect. The compliance standard was created, after all (according to the AICPA), to study the controls that are established at providers that offer services to customers “when those controls are likely to be relevant to user entities’ internal control over financial reporting.”

 

In other words, basically any business that sells services (as opposed to products) online, such as a hosting service, should have an SSAE 16 audit performed.

 

Schellman explained that managed service providers, colocation facilities, and data centers that operate computing systems containing data relevant to financial statements must place reasonable controls within the system – for environmental and physical security. (It is necessary for the company to properly protect its hardware from theft or damage, for instance.) Since a responsible organization places best-in-class standardized controls on its information technology, it is only natural that a data center should have this form of compliance.

 

A data center may think that SSAE 16 or SSAE 18 is not important to them. The fact is that the American Institute of CPAs has a compelling position as a professional association of certified public accountants – an association that claims integrity as one of its values, with the statement, “We are committed to upholding the highest ethical standards to maintain trust and credibility with colleagues, members and the public.”

 

Not all are sold on SSAE standards

 

There are some in IT who are less convinced of the across-the-board value of SSAE certification or compliance. Some see it as too baseline a set of standards; others dismiss it as a marketing gimmick. “Determining if SSAE certification or auditing is right for your data center depends on your clientele and whether you want to expand that clientele by demonstrating certain safeguards for customers,” noted Jeff Clark in Data Center Journal.

 

Clark added that there is very little easily digestible information online about various forms of regulatory or third-party industry compliance. It’s very dense. Well, compliance and standards are necessarily dense. It’s technical writing in a style, meter, and tone that is similar to law. Clark hinted toward the complexity of assessing organizations for compliance: one of the most obvious ways to decide if you care about this credential or body of knowledge is to go directly to the standard’s developer: “[C]onsult the AIPCA (American Institute of Certified Public Accountants) Statements on Standards for Attestation Engagements—already a mouthful,” he said, “to find the relevant section on SSAE16: Reporting on Controls at a Service Organization.”

 

In defense of SSAE compliance

 

Since Clark gave the standard a bit of a beating, it is worth noting that the length of the nomenclature or terminology surrounding a body of knowledge is not necessarily a good indicator of whether the guidelines within it have value, integrity, or veracity. In fact, seeing “Statement on Standards for Attestation Engagements” is comforting to some, since it is dryly descriptive and logical – which is what many want from compliance mechanisms based on libraries of collaboratively understood generally accepted accounting principles (GAAP).

 

The SSAE standard is a rigorously created set of controls that allows accountants to attest that they believe financial data is safe. Then again, maybe we are biased, since we are SSAE 16 certified and SSAE 18 / AU 324 compliant, with HIPAA & HITECH / GDPR / FISMA / PCI compliant systems available, based on extensive engineering experience. See our data security commitment.

14 Reasons to Get a Better Hosting Service – Part 2

Posted by & filed under List Posts.

<<< Go to Part 1  

 

#13 – You get “Internal Server Error” messages.

 

An internal server error, as its name suggests, means the server got stuck when trying to answer your request, arriving at an impasse. Its error code is 500 – the more technical identifier of this problem. You may see other wording alongside the number, such as “Error,” “Temporary Error,” or “HTTP Error.” It can occur in a WordPress environment or on any other type of site. Essentially, the server does not know what to do; because of the problem within the server, it is forced to send you an error message rather than what you requested.

 

If you are experiencing this error on a regular basis, you may need a new provider. First speak with your host to see if they can resolve the issue. If the problem is that the provider does not know how to properly configure its servers for your requirements, you either need to order managed services for more specialized configuration needs or to go elsewhere (a decision you can weigh once you know whether and how they can solve the issue).
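Before raising the issue with your host, it helps to have evidence of how often the error actually occurs. A small probe, run on a schedule, can record what your site returns. This is a minimal sketch using only the Python standard library; the function names and the URL in the comment are our own placeholders:

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def check_site(url: str, timeout: float = 10.0):
    """Return the HTTP status code for `url`, or None if unreachable."""
    try:
        return urlopen(url, timeout=timeout).status
    except HTTPError as e:
        return e.code   # 500, 503, etc. still carry a status code
    except URLError:
        return None     # DNS failure, refused connection, or timeout

def is_server_error(status) -> bool:
    """True for any 5xx code -- the family 'Internal Server Error' (500) belongs to."""
    return status is not None and 500 <= status <= 599

# e.g., run from cron and log whenever
# is_server_error(check_site("https://example.com")) comes back True
```

A log of timestamps and status codes turns "my site keeps breaking" into a concrete pattern your provider can investigate.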

 

#14 – You outgrow the host and its bargain-basement prices.

 

It is a good thing for your site to grow. It is important to know, though, that growth will inevitably mean you need to reexamine your hosting to see if it can still meet your requirements. Shared hosting, for instance, is a scenario in which you and many other companies are all using the resources of a single physical server. It is what you have if you have chosen the least expensive hosting option. As we know, the most affordable service is not always best – and like any other field, hosting plans are not created equal. You will be able to manage how many resources you use via caching and via a content delivery network (CDN), as indicated by WPBeginner. However, there are three ways this situation can go wrong:

 

  • Your performance suffers. If other sites are consuming the resources quickly during a rush of activity, your site will exhibit poor performance.
  • Your access to resources is terminated. If you start excessively using resources on a shared server, you become the one causing the problem described above. At that point, many hosts will immediately stop feeding you resources to stabilize the server environment and stop the harm to all the other accounts. Imagine how frustrating that would be if you were just starting a huge surge following press coverage or a product release.
  • You get hacked. The security of shared hosting accounts, compared to other options, is awful.

 

3 basic security issues with shared hosting

 

You may know that shared hosting is not great from a security perspective. What are the core issues that can arise, though? Security training site InfoSec Institute lists three basic problems with the security of shared hosting:

 

  • All it takes is for one of the sites on the server to get hacked. Once that intrusion is successful, the attackers are in the door and can access your site as well.
  • An attacker does not even need to unlawfully access the server. Since the hosting packages are inexpensive, a nefarious party could purchase a plan and access your site within the server.
  • You cannot protect yourself properly, since you are unable to harden (security-optimize) a shared server. You cannot get into the Apache (web server) and PHP (coding language) configurations through which you could improve your defenses.
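To illustrate what that lost access costs you: on a server you do control, a handful of standard Apache directives and php.ini settings close off common information leaks. The directive names below are real; the values are illustrative, and availability depends on your build and loaded modules:

```apache
# Apache hardening you typically cannot apply on a shared server
ServerTokens Prod            # stop advertising the Apache version
ServerSignature Off          # no version banner on error pages
TraceEnable Off              # disable HTTP TRACE requests
Header always set X-Frame-Options "SAMEORIGIN"   # requires mod_headers
```

```ini
; php.ini hardening, likewise off-limits on most shared plans
expose_php = Off             ; hide the X-Powered-By PHP header
display_errors = Off         ; keep stack traces away from visitors
disable_functions = exec,shell_exec,system,passthru
```

On shared hosting, settings like these are fixed by the provider for every account on the box.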

 

Hosting to pick up speed as cloud & IoT soar

 

The web hosting industry is relatively mature but is currently growing at 7.1% annually, according to a June 2017 report from business intelligence firm IBISWorld. The Internet of Things (IoT) is expected to give rise to a huge boost in hosting, as indicated by a report from Market Research Future that expects the general web hosting industry to grow at 16% through 2022, to achieve $152 billion (USD).

 

Certain segments of hosting are growing even faster than that, though. For example, cloud hosting is expanding at a somewhat incredible rate – substantially faster than cloud services generally, according to an October 2017 Gartner report. While public cloud services (platform as a service, or PaaS; software as a service, or SaaS; and infrastructure as a service, or IaaS) are together predicted to rise at an 18.5% compound annual growth rate (CAGR), IaaS – which is simply another name for cloud hosting – is projected to reach $34.7 billion (USD) in 2017, growing at a 36.6% CAGR.

 

Why people choose cloud hosting

 

Since shared hosting is so problematic but dedicated servers do not make sense for many projects from a cost and convenience perspective, many organizations are turning to cloud. A primary reason is that, unlike shared hosting, it is highly secure.

 

Yes, setting up a cloud environment requires an assessment of the protections it has in place because every provider is different. However, entrusting your infrastructure to a third-party provider that has full-time security professionals managing data at all times is a plus. “The truth is that the public cloud is more secure than the typical data center,” noted David Linthicum in InfoWorld, “and IT would get better security if it got past its prejudice against the cloud.”

 

Here are a number of other reasons why cloud is preferred by business, according to Business Queensland, an Australian public agency:

 

  • Lower cost
  • Improved scalability
  • Business continuity (having your data backed up at a distant location)
  • More efficient collaboration through cloud-based apps
  • More flexible work arrangements and locations (since you can access anywhere)
  • Updates applied automatically by the cloud provider (the core hosting apps, and others with managed services), improving security.

 

How to recognize a great hosting company

 

One of the best things you can do to avoid poor hosting is to get the perspective of people who have been using the service. For that reason, a great hosting company should have strong ratings and reviews on platforms such as Facebook.

 

Mike White has been a customer of ours for more than six years. He is a web developer in Louisville, Kentucky – the head web developer at Jandango Web Solutions, which focuses on e-commerce websites. He noted in a Facebook review that, if a client site has any sort of problem, “I immediately test it on a TSS server.” He added, “If you want great service, extremely knowledgeable techs, and a successful site, you need to host with these guys.” We have numerous other positive Facebook reviews.

 

We have a long history of positive reviews, as indicated by discussion in Web Hosting Talk that is now unavailable but was cached by the Internet Archive. One customer, going by the username tdenator, noted that he had been with us for a month and that he was getting back responses to tickets very rapidly. “I can’t say enough about TSS,” he concluded. “They are fast and thorough.” Another customer, no1uknow, said that he had been using us to back systems for small businesses and enterprises. Based on that experience, he said, “I will never trust anyone else with my servers.”

 

There are many reasons to leave a bad host and the same number of reasons to choose a great one. You want a hosting service that goes beyond your expectations. At Total Server Solutions, our service is what sets us apart, and it’s our people that make our service great. See our testimonials.

14 Reasons Your Hosting Service No Longer Deserves Your Business

Posted by & filed under List Posts.

As an industry, hosting is entering an era of faster expansion. One of the primary areas is cloud hosting. The Internet of Things (IoT), which typically utilizes cloud hosting, is the clearest core reason behind cloud growth forecasts.

 

It is easy to get frustrated with a hosting service. Businesses have become increasingly dependent on the reliability and support of their online presence, and of the systems that back their internal and external functions. The hosting industry is a highly competitive market. In this climate, you have the right to excellent customer service, with fast action to solve your problems – in other words, support that responds quickly, without leaving you waiting around for answers.

 

Despite the consolidation happening within the hosting industry, there is still a wide variety of hosting services from which to choose. When your host is not impressive in its support or otherwise, it is time to switch to one that treats you with respect.

 

That said, no one wants to get hasty and make a mistake in what is certainly a complicated and important decision. After all, you will need this organization to have a strong service level agreement (SLA) and to deliver on it – and for its support staff to be highly available and highly competent.

 

This two-part series covers the following topics:

 

  • 14 ways bad hosting hurts your business
  • Cloud & IoT will cause web hosting to soar
  • How to recognize a great hosting company

 

14 ways bad hosting hurts your business

 

Here are a few ways that web hosting can be negative for your organization, as summarized by WPBeginner and entrepreneurship writer Debra Carpenter:

 

#1 – They hurt your visibility. A poor web host will hurt your search ranking. How? Well, page speed is a ranking factor. In other words, your search rankings will be impacted by the performance of your server; if a web host does not deliver a high level of service (basically keep the system up and running strong), that effectively works against the search engine optimization (SEO) that is so critical to your exposure.

 

#2 – You could suffer downtime. If you have a poor hosting company, you could experience a great amount of downtime, and your site could suffer substantial latency (the gap between a request for a data transfer and the time it is fulfilled). If your host is slow (i.e., you and your customers experience high latency) or is frequently down (i.e., has poor uptime), it is time to switch to a different company.

 

#3 – You lose traffic. The average amount of time a person would wait for a page to load before abandoning it was 6-10 seconds, according to research from behavioral analytics firm Kissmetrics. In other words, your potential customers do not typically have very much patience. You must get them what they need quickly.

 

#4 – Customers become unsure about buying. If your infrastructure is not serving your site well (as would be the case with poor hosting, since you are basically using their physical data center equipment as your infrastructure), many customers will be driven away. A consumer will become uncomfortable with your site if it is unresponsive, especially if they are not too familiar with your brand.

 

#5 – Spikes in traffic are wasted. You might have a huge surge in traffic at a certain point, such as when you are offering a promotion, are running a series of ads, or have just posted a blog article that is getting attention. While that scenario sounds great, it can be frustrating if your hosting company is not able to maintain high performance. If the spike in traffic results in longer load times or crashes the site, you will not be able to turn that greater traffic volume into revenue.

 

#6 – It hurts your brand trust. Customers expect to be able to use your site and its tools. If anything is not functioning properly, they will often then pass judgment on the quality of your products or services. They may also think that your message, expressed through your content, is no longer credible.

 

#7 – You get bad customer service. Web hosting companies will always have some people who don’t like them, noted WPBeginner, adding that “only the [angriest] users leave web hosting reviews.” That is true to a point. Even if there are a few upset customers, you should still see great support and service described repeatedly as you look through individual customer perspectives.

 

#8 – Your company is unable to grow as fast. Failing at reliability or availability, resulting in corruption or slow load times, is a common experience for people who are with poor web hosts. When your site is unavailable or very slow, or when your systems cannot produce reliable responses, you will lose sales and newer customers who assume that the poor performance is typical for you.

 

#9 – You can end up with poor data security. Attackers could start pummeling or surgically invading your site at any point. However, a strong host will be able to stop many of them, as well as identify and mitigate them if they do enter your system. Check the security policies of the company and the extent to which the safety of your data seems to be prioritized. To know you can respond and adapt quickly, again, be certain that support is available and has a high degree of expertise. You also want to know that they back up your information sufficiently. You should feel comfortable with the SLA and its terms, as well as any other policies, if your data is ever breached.

 

#10 – Your site gets suspended. A hosting provider will generally shut down sites that breach their policies, which include the right to shut down accounts doing anything illegal. Some hosting services, though, may suspend you for actions you did not take or that you believe are legal and do not actually violate the agreement. If that is the case, you should certainly speak with another hosting provider.

 

#11 – Your site stops working as a sales tool. Your site is effectively a salesperson. Since that is the case, you want it to look right and to perform predictably. Basically, your site demonstrates the entire dynamic – not just the language but the appearance and consistency with which you approach the customer. Regardless of the design of your site and its data assets, it ultimately has to run well. If your site is, in a way, your top salesperson, then bad hosting can make your salesperson poor – not responding quickly enough to allow the customer to feel respected.

 

#12 – You keep getting “Error Establishing Database Connections” in WordPress. You may have a plugin that is not working properly, or your account has become too busy for a plan with a set limit of resources. Whatever the situation, your hosting provider’s ability and willingness to help you solve this problem should be a gauge. If they cannot or will not help you solve this issue, you need to go elsewhere.

 

>>> Check out Part 2 here

 

Taking action

 

Is your hosting service not living up to your expectations? In such a competitive market, you deserve a relationship that is founded on trust, respect, and follow-through. At Total Server Solutions, when you become our customer, you can trust that all our decisions are driven by our relentless desire to help you succeed. See our mission and philosophy.

image of data center diversity and isolation for security

Posted by & filed under List Posts.

The nonprofit Identity Theft Resource Center keeps an ongoing record of incidents involving data compromise, gathered from government agency releases and articles in the press. Between the start of this effort on January 1, 2005, and February 7, 2018, the organization logged more than 8,600 breaches, with a staggering 1.07 billion records exposed. Clearly, securing a data center is a top priority.

 

Elements of a secure data center

 

Core elements that you need for data center security are:

 

Uninterruptible power supplies (UPS) – Backup generators and UPS systems allow you to keep your infrastructure up and running when you have a power outage – important (for one thing) so that you maintain the uptime listed in your service level agreement (SLA) with customers.

 

Environmental controls – Cooling is essential to data centers: if you have too much heat, your hardware will be more likely to become defective and will need to be replaced more frequently. Servers create an enormous amount of heat, so they are essentially a threat to themselves. Environmental controls keep them cool and safe. Fire suppression is another control that is needed.

 

Security systems – You want to have a wide range of security technologies and protocols. In terms of basic access controls, you want protections such as the somewhat awkwardly named mantraps (small rooms to isolate individuals at entry), multi-factor ID authentication, surveillance platforms, cage locks, and biometric systems. Here are four core aspects to include:

 

  • Surveillance: Internally, metal detectors let you know if any equipment is leaving without authorization. Externally, cameras allow you to look for unusual activity. Overall, you have a video record if there is a breach.

 

  • Security guards: Often organizations will employ full-time security guards to protect their data centers. These individuals can secure both the inside and outside of the building. An organization could allow these professionals to carry firearms (as some companies do) or not.

 

  • Single-purpose facility: A critical feature of a secure data center is that it is single-purpose rather than multipurpose. Multipurpose means there are personnel at the facility, typically in connected offices, who are not involved in running the data center. When a data center is truly secure, it will be built in a location and designed in a manner that reflects its purpose. A purpose-built data center will be set back from roadways (in part to limit its visibility) and have crash-proof barriers installed.

 

  • Access controls: You should have numerous access controls in place. Control mechanisms and protocols may include electronic access cards and biometric systems. Mantraps are typically part of the layout to stop tailgating (an unauthorized person getting into the building by trailing directly behind an authorized person). Scales are used to measure people and determine if their weight has changed since entry (in which case they might be attempting to steal hardware).

 

Steps to improve the security of a data center

 

Beyond implementation of the above elements, here are a few rather straightforward steps you can take to improve protection within any data center:

 

Step 1 – Phase out legacy equipment.

 

Both your security stance and your ability to deliver services efficiently will be negatively affected by the use of legacy hardware. Aging servers and networking components must be maintained for protection (through updates/patches) but often are not. While an older machine may seem worthwhile to keep because it is functional, it actually is a threat to the business because it is an exploitable attack vector – so while the system may function in isolation, it could lead to dysfunction for the entire business. As an indication of that vulnerability, Bill Kleyman noted for data center cooling firm Upsite, “I’ve seen both security incidents as well as data center failures happen because of older gear.”

 

Since aging equipment is such a major issue for the health of your infrastructure, be vigilant about knowing what hardware is currently in your data center. Check any remote facilities where you house hardware, and check your closets for technological artifacts. Any legacy components you do keep should be fully updated; hardware past its prime should be sold, recycled, or discarded. Taking this step improves efficiency alongside security.

 

Step 2 – Consider best-in-class monitoring solutions an investment.

 

You will be able to integrate two key concerns, data center facility management and information technology (IT) management, with a data center infrastructure management (DCIM) system. In other words, it is critical to go beyond the computers to encompass the entire built environment – monitoring secure locations and the locks on cages and doors, for instance. This approach is important not just to guard against nefarious actors doing damage to the system or injecting it with malware, but also to prevent theft of servers.

 

While DCIM will give you a great sense of ongoing performance (and any threats to reliability and availability), you will also be able to see if a cage was not secured after use, along with the person who entered the area most recently. Environmental monitoring (such as checking the temperature) is also essential to the health of the equipment. A DCIM solution will allow you to check all these elements. As with legacy removal, you will improve your security while gaining efficiency and sustainability.
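The two kinds of DCIM check described above – environmental thresholds and cage-access status – can be sketched in a few lines. This is a minimal, hypothetical illustration (the metric names, thresholds, and log format are invented for the example), not the API of any real DCIM product:

```python
# Minimal sketch of DCIM-style checks; a real system would poll
# hardware sensors and door/cage controllers continuously.

THRESHOLDS = {"temperature_c": 27.0, "humidity_pct": 60.0}  # assumed limits

def check_environment(readings):
    """Return alert strings for any reading over its threshold."""
    alerts = []
    for metric, value in readings.items():
        limit = THRESHOLDS.get(metric)
        if limit is not None and value > limit:
            alerts.append(f"{metric} at {value} exceeds limit {limit}")
    return alerts

def check_access(cage_log):
    """Flag cages left unsecured, with the last person who entered."""
    return [
        f"cage {cage} left unsecured (last entry: {entry['last_person']})"
        for cage, entry in cage_log.items()
        if not entry["secured"]
    ]

if __name__ == "__main__":
    print(check_environment({"temperature_c": 29.5, "humidity_pct": 48.0}))
    print(check_access({"C-12": {"secured": False, "last_person": "j.doe"}}))
```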

 

Step 3 – Create your data center using your workloads as a basis.

 

You may think of a data center as a single set of resources – but it is actually helpful to think of it as a facility within which you can create smaller ecosystems for a diverse array of use cases. Isolation is key to security, and demarcating workloads from one another also allows you to treat each of them separately rather than with a single, one-size-fits-all approach.

 

For instance, you may use modular containment and other techniques to set off a system that delivers high-performance computing (HPC). You may want certain areas of the data center to be set up to handle and store critical information. Your power management may differ from one workload to another (think efficiency optimization), as may your environmental efforts. You want the equipment and monitoring to match the applicable system.

 

Step 4 – Embrace the value of auditing, testing, and reporting.

 

Testing to improve your efficiency and security is extraordinarily helpful and should be performed at routine intervals. Performing these tests helps ensure that you are adapting appropriately to your organization’s development – since strong data centers are continually modified to meet the needs of a growing business.

 

Thinking from the perspective of efficiency, you will be able to make tweaks as you rigorously study the data center’s performance. In the same way, and arguably more importantly, you want to be certain that data is kept safe through security monitoring. Through data center management tools, you can boost your efficiency levels over time by analyzing CFD (computational fluid dynamics), power consumption, and environmental aspects. From a security standpoint, you can test and audit a wide umbrella of concerns, including user privileges, system locations, and physical access.
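One efficiency metric such power analysis typically tracks is power usage effectiveness (PUE): total facility power divided by IT equipment power, where values approach 1.0 as cooling and other overhead shrink. The figures below are illustrative only:

```python
# PUE (power usage effectiveness) = total facility power / IT load.
# A lower value means less power spent on cooling, lighting, etc.

def pue(total_facility_kw, it_equipment_kw):
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# e.g., 1500 kW total facility draw against 1000 kW of IT load:
print(round(pue(1500.0, 1000.0), 2))  # 1.5
```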

 

A secure data center for your assets

 

Because securing data is so sophisticated and challenging on-premises, many organizations choose to host some or all their systems through an external provider. It is critical to be certain that these outside parties care about your data as much as you do. At Total Server Solutions, our system is audited using the highest standard in data security, SSAE 16 Type II. See our security commitment.

How to Secure Your Cloud Server

Posted by & filed under List Posts.

A few years ago, security was listed as one of the biggest reasons people might not want to entrust their data to the cloud. For good reason, companies have been careful and systematic in figuring out what information systems to use; security challenges on the Internet are by no means a new thing. Even back in June 2011, 9 in 10 US firms said that they had suffered at least one data breach within the previous year. That’s right: 90% of companies (out of 583 companies polled) said they had been successfully compromised by an outside party within the past twelve months. Almost 60% said that their firm had experienced at least two attacks within those same twelve months.

 

A wise and important focus on security was omnipresent in early discussions of cloud computing, and it continued to be a top concern in the years ahead. A survey conducted by IDG and published in August 2013, “Cloud Computing: Key Trends and Future Effects Report,” revealed that the top challenge for an effective cloud plan was security – at 66%, much higher than concerns over stability, reliability, and integration (47%) or over whether the service would deliver on organizational and compliance standards. (The poll gathered responses from 1358 people, all of them in decision-making positions and most with managerial roles within IT.)

 

Again, this concern has continued through the years. In November 2016, another IDG report came out, the 2016 IDG Cloud Computing Survey, showing that many companies still had similar concerns with cloud. That poll found that firms were moving huge swaths of their environments to the cloud, with 60% in some cloud configuration (public, private, or hybrid). (These figures were based on the responses of approximately 1000 information technology executives.) Even though cloud was widely deployed, security was still the top concern for 41% of those polled.

 

The concern with security has resulted in somewhat of a backlash, though, from those who are now convinced that the security of cloud is preferable to what is available in traditional data centers. For instance, David Linthicum reported in 2014 that cloud was more secure than a typical business’s traditional data center. Similarly, deputy technology editor Quentin Hardy noted in the New York Times that most major data breaches in recent years have been from attacks on traditional systems. Data may effectively be safer in the cloud because there are more security precautions in place – since security is a fundamental, core concern of any company that is serious about hosting cloud servers.

 

7 steps to secure a cloud server

 

Here is a list of seven ways to secure your cloud server, standard best practices indicated recently by Simility CEO Rahul Pangam:

 

Step 1: Implement end-to-end encryption for in-transit data.

 

You want to make sure that any time you interact with your cloud server, you do so over Transport Layer Security (TLS 1.2 or later), the successor to the secure sockets layer (SSL) protocol, so that your traffic is effectively locked. The termination point of the SSL/TLS certificate should be the cloud provider.
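In Python, for instance, the standard-library `ssl` module can enforce a TLS 1.2 floor on any client connection. This is a minimal sketch; a real deployment would wrap this context around its HTTP or socket layer:

```python
import ssl

# Build a client-side TLS context that refuses anything older than
# TLS 1.2 and verifies the server's certificate chain and hostname.
ctx = ssl.create_default_context()            # secure defaults: verification on
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject TLS 1.1 and below

assert ctx.verify_mode == ssl.CERT_REQUIRED
assert ctx.check_hostname is True
```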

 

Step 2: Implement encryption for at-rest data.

 

Everyone thinks immediately about data that is in motion. However, data that is in one place must be protected as well. As Pangam puts it, encryption of at-rest data is “the only way you can confidently comply with privacy policies, regulatory requirements and contractual obligations for handling sensitive data.” It is certainly a best practice in an increasingly complex threat landscape.

 

You want to use the AES-256 standard whenever you store data on disk within the cloud. Your encryption keys also need to be encrypted themselves, and there should be a system in place to rotate the master key set at routine intervals.

 

Your cloud provider will also hopefully allow field-level encryption, so that you can encrypt Social Security numbers, credit card numbers, CPF numbers, and other highly sensitive fields.
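The routine rotation of the master key set can be tracked with simple version bookkeeping. The sketch below is standard-library only and assumes a hypothetical 90-day policy; the actual AES-256 wrapping of data keys would be delegated to a proper cryptography library, and the key material here is random bytes purely for illustration:

```python
import secrets
from datetime import datetime, timedelta, timezone

ROTATION_INTERVAL = timedelta(days=90)  # assumed policy, not a standard

class MasterKeyRing:
    """Tracks versioned 256-bit master keys and when each was created."""

    def __init__(self):
        self.versions = {}  # version -> (key bytes, created timestamp)
        self.current = 0
        self.rotate()

    def rotate(self):
        """Mint a new master key version and make it current."""
        self.current += 1
        self.versions[self.current] = (
            secrets.token_bytes(32),        # 256 bits of key material
            datetime.now(timezone.utc),
        )

    def rotate_if_due(self):
        """Rotate automatically once the interval has elapsed."""
        _, created = self.versions[self.current]
        if datetime.now(timezone.utc) - created >= ROTATION_INTERVAL:
            self.rotate()
        return self.current

ring = MasterKeyRing()
v1 = ring.current
ring.rotate()                # forced rotation, e.g. after an incident
assert ring.current == v1 + 1
assert ring.versions[v1][0] != ring.versions[ring.current][0]
```

Old versions stay in the ring so that data encrypted under them can still be unwrapped, then re-encrypted under the current key.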

 

Step 3: Conduct thorough and regular vulnerability assessments.

 

Any company that you entrust to provide you with a cloud service should have strong and carefully strategized incident-response and vulnerability practices and systems in place. One feature that you want in terms of incident response is the ability to completely automate the risk scans that look for vulnerabilities; that way you can perform critical security audits daily, weekly, or monthly, rather than quarterly or yearly.

 

You can make a security case for vulnerability testing daily. However, within your own ecosystem, you can decide what frequency makes sense for a particular network and/or device. This testing can be set up ahead of time or run at will.
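One small check an automated scan of this kind might run is probing a host for unexpectedly open TCP ports. The sketch below (standard library only, with illustrative port numbers) shows the idea; a real scanner run from a daily or weekly scheduler does far more:

```python
import socket

def open_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` accepting TCP connections on `host`."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:   # 0 means the connect succeeded
                found.append(port)
    return found

if __name__ == "__main__":
    # Probe localhost for a few ports that should normally be closed.
    print(open_ports("127.0.0.1", [23, 2323, 5900]))
```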

 

Step 4: Set up and follow a data deletion policy.

 

You should configure your system to automatically delete customer data once it passes the retention window listed in the user agreement.
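An automated retention sweep reduces to partitioning records by age. The sketch below uses an assumed 365-day window and invented field names; in practice the window comes from each customer's agreement:

```python
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=365)  # assumed window for illustration

def purge_expired(records, now=None):
    """Partition records into (kept, deleted) by the retention window."""
    now = now or datetime.now(timezone.utc)
    kept, deleted = [], []
    for rec in records:
        target = deleted if now - rec["created"] > RETENTION else kept
        target.append(rec)
    return kept, deleted

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
records = [
    {"id": 1, "created": datetime(2022, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "created": datetime(2024, 3, 1, tzinfo=timezone.utc)},
]
kept, deleted = purge_expired(records, now)
assert [r["id"] for r in deleted] == [1]
assert [r["id"] for r in kept] == [2]
```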

 

Step 5: Focus on user-level security for better protection.

 

You want layers of security, and one way to create layers is with the user. A customer should be able to change the editing and access privileges for their information at the level of each user, and it is easy to provide this capability with role-based access control (RBAC). RBAC lets you delineate tasks in a highly granular way, with access controls as its foundation. The care that you put into setting up your RBAC system will make it easier to meet internal data security standards, along with compliance with any external standards such as PCI, HIPAA, or the GDPR.
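At its core, RBAC maps roles to permission sets and users to roles, then answers one question: is this user allowed this action? The role and permission names below are illustrative; real systems add resource scoping and audit logging:

```python
# Minimal RBAC sketch: permission checks go through roles,
# never directly from user to permission.

ROLE_PERMISSIONS = {
    "viewer": {"read"},
    "editor": {"read", "write"},
    "admin":  {"read", "write", "grant"},
}

USER_ROLES = {"alice": "admin", "bob": "viewer"}

def is_allowed(user, permission):
    """True if the user's role carries the permission; default deny."""
    role = USER_ROLES.get(user)
    return permission in ROLE_PERMISSIONS.get(role, set())

assert is_allowed("alice", "write")
assert not is_allowed("bob", "write")
assert not is_allowed("mallory", "read")   # unknown user gets nothing
```

Changing one user's privileges is then a single reassignment in `USER_ROLES`, which is what makes the scheme easy to audit against internal or external standards.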

 

Step 6: Get a virtual private network and cloud.

 

In traditional hosting environments, a dedicated server is an individual physical machine used by a single organization; alternatively, a machine can be divided into multi-tenant or virtual private servers. In the context of cloud, you want your provider to give you a cloud instance that is yours and yours alone – one to which you have the sole right of access and control of the data. Customers connect to your data center, and the traffic that goes back and forth to their virtual private cloud travels via an Internet Protocol Security (IPsec) virtual private network (VPN), a standardized means of sending encrypted data.

 

Step 7: Look for strong compliance audits and certifications.

 

The two critical third-party certifications that you want to see in your cloud provider are Payment Card Industry Data Security Standard (PCI DSS) and SSAE 16 / SSAE 18 / SOC 1 / SOC 2:

 

  • PCI: PCI DSS compliance, critical to e-commerce solutions, requires a comprehensive audit that is focused on data safeguards during transmission, processing, and storage of data. Note that PCI DSS does have a rather granular focus on payment data, specifically cardholder data, because these standards are designed and promoted by the major credit card brands – Discover, MasterCard, Visa, American Express, and JCB – through the PCI Security Standards Council. Nonetheless, the standard does have strong, thorough guidelines for highly important security techniques including application development; network design; policies and procedures; and vulnerability management.
  • AICPA: SSAE 16, SSAE 18, SOC 1, and SOC 2 are related compliance standards; the naming reflects a transition underway at the American Institute of Certified Public Accountants (AICPA), which develops all of them. These standards are focused on the controls in place at service providers; the audits are intended to help companies find and fix any flaws in their vendor management environments, compliance management systems, and risk assessment programs. These standards demonstrate through third-party auditing that a cloud provider has an infrastructure and set of policies in place that meet strong stipulations, as established by an accounting professional organization.

 

Launching your cloud server

 

Do you need a cloud server that you are confident will be fully protected by your infrastructure provider? At Total Server Solutions, our SSAE 16 Type 2 Audit is your assurance that we follow the best practices to keep our data center up and running strong. See our security commitment.