Why You Should and Shouldn't Use Colo


What if you could somehow pass on your server room responsibilities to someone else? How would it feel to get access to the network power, performance, and staff of a huge enterprise? If you replied, “That would be awesome,” to either of the above questions, colocation may be the right choice for you. However, there are pros and cons to this approach. Let’s first explore what this IT approach is, take a quick look at the current market, examine the key elements that are changing within the industry (to better understand what is affecting providers), and then weigh the pros and cons of taking this route.

 

To colocate, or not to colocate?

 

Colocation is leasing space in an outside data center for your servers and storage – with the owner of the facility meeting your needs for a secure physical location and internet connection. Unlike with cloud hosting, all of the hardware in a colo relationship is owned by you. This arrangement is attractive to many companies because of basic economies of scale: you can access a highly skilled staff, improve your bandwidth, bolster your data safety, and access more sophisticated infrastructure. Your bill is basically a combination of rack space and some degree of maintenance (often minimal).

 

Where is colocation in 2017?

 

Colocation providers are entering a trickier landscape as the market gets hotter: buyer personas are proliferating; sustainability is becoming a greater concern; cloud hosting is on the rise; and computing strategies are becoming increasingly diversified and complex. Just how hot is colocation getting? With a 14.4% CAGR between 2011 and 2016, the industry is a bit steamy. (But don’t worry: enterprise-grade, multiple redundant cooling systems ensure that your hardware will never sizzle.)

 

Related: “How to Use Colocation to Your Advantage”

 

As the market continues to develop, colocation vendors must have the agility to reshape themselves in response while also looking for ways to build their own business by incorporating breakthrough equipment and strategies, and by continuing to focus on operations, affordability, and performance.

 

Changing elements of the colocation industry

 

Here is a look at some of the key aspects of the market that are evolving, keeping life interesting for those who work at colocation providers:

 

Who buys colocation? In the past, people in facilities or procurement roles were typically the ones engaging with colocation vendors. Now, though, choices about infrastructure are being made by a broader group that includes line-of-business and C-level management. Since colocation firms are now interacting with more COOs, CFOs, and heads of business units, it is increasingly important that they are prepared, from both sales and business perspectives, to “talk shop” meaningfully with individuals from a wide array of backgrounds.

 

How is DCIM used? Both internally and as a value-added service, data center infrastructure management (DCIM) software is becoming a more central function in colocation facilities. DCIM bolsters service assurance while leading to better consistency across analytics. It allows companies to convert their data into actionable metrics and gives infrastructure executives insight into speed and reliability throughout the scope of systems, for more accurate, knowledge-driven decisions. These gains lead to a less expensive, more highly available, and more efficient ecosystem.
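
To make the kind of metric DCIM surfaces concrete, consider power usage effectiveness (PUE), a standard data center efficiency ratio: total facility power divided by IT equipment power. The sketch below is a minimal illustration with made-up readings; it is not tied to any particular DCIM product.

```python
# Minimal sketch with hypothetical readings: the sort of efficiency metric a
# DCIM tool derives from facility telemetry. PUE = total facility power / IT power.
def power_usage_effectiveness(total_facility_kw, it_equipment_kw):
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

reading = {"total_facility_kw": 1450.0, "it_equipment_kw": 1000.0}  # made-up sample
print(f"PUE: {power_usage_effectiveness(**reading):.2f}")  # 1.45 -- lower is better; 1.0 is ideal
```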

 

How is the data center designed? Data center layouts must accommodate cloud hosting, edge computing, and other growing approaches. Because these approaches are in rapid flux, adaptability must be built into the architecture; flexibility makes it possible to pivot to meet different applications and needs. What colocation centers want to avoid, on the other hand, is limited service options and stranded capacity. Addressing these issues requires a sustained focus on density and support for mixed-density rows. Modular design allows right-sizing, so colocation firms do not overprovision from the outset. These vendors must also decide how much resiliency to implement – keeping in mind that high resiliency, like high density, is expensive. Additionally, safety must be considered as an element of design, especially since higher density, in and of itself, poses a greater risk to staff.

 

What is the upside of colocation?

 

It is, of course, a little awkward for a colocation provider to discuss the “cons” of colocation; however, our broader infrastructure and managed services scope allows us to view this approach from a more consultative perspective.

 

When it comes down to it, there is an immediate and obvious disadvantage of colocation, depending on your perspective: control. Anyone who chooses this route knows they are handing their servers over to someone else.

 

Well, so then why do people do it? For one thing, yes, you lose control of your servers in a physical sense, but you do retain much more control over them than in many hosting scenarios (most notably cloud, since that option is often juxtaposed with colocation).

 

Beyond that, reasons vary. Small and midsize businesses can use it to affordably access a more sophisticated computing environment than they have onsite. Another key, organization-nonspecific reason that colocation is used comes from Michael Kassner of TechRepublic: “[M]ost managers said their colocated equipment was mission critical, and the colocation providers were able to meet their requirements at a lower cost than if the service was kept in-house.” Sounds simple enough.

 

Here are a few additional ideas from Susan Adams of Spiceworks on the advantages of entrusting your servers to a colocation facility:

 

  • Improved physical security (think access logs, cage locks, and cameras)
  • Helpful support (well, if you’ve chosen the right provider)
  • Better uptime, since you’re getting access to cutting-edge uninterruptible power supply (UPS)
  • Better cooling so that your hardware gets better care
  • Scalability, since all you have to do is send the data center more machines
  • Connections with various major ISPs through dedicated fiber.

 

Colocation is often more cost-effective than using your own data center, since the amount you get billed is inclusive of HVAC costs and power. “Even without those cost savings, though,” says Adams, “you’re paying for the life-improving peace of mind of an enterprise-quality, stable, and fast data center.”

 

What is the downside of colocation?

 

Starting from scratch, colocation can come with a somewhat sizable price tag simply because you need to supply your own software and servers. Once you have all your machines ready, you also want to watch your bill in a colocation setting. To keep your costs contained, be careful about exceeding your plan’s maximum bandwidth, says Adams; a sketch of how such a bandwidth bill is often estimated follows below.
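
Billing models vary by provider, but bandwidth is commonly sold as a committed rate plus overage, often measured at the 95th percentile of periodic usage samples. The sketch below is a rough illustration of that arithmetic with made-up rates and samples; check your own contract for the actual terms.

```python
# Hypothetical sketch: estimating a monthly bandwidth bill under 95th-percentile
# billing. All rates and samples below are made up for illustration.
def estimate_bandwidth_bill(samples_mbps, commit_mbps, commit_price, overage_per_mbps):
    """samples_mbps: periodic (e.g., 5-minute) usage samples for the month, in Mbps."""
    ranked = sorted(samples_mbps)
    index = max(int(len(ranked) * 0.95) - 1, 0)  # discard the top 5% of samples
    billable_mbps = ranked[index]
    overage = max(billable_mbps - commit_mbps, 0)
    return commit_price + overage * overage_per_mbps

# Example: 100 Mbps commit at $500/month, with $5 per Mbps of overage.
samples = [80, 90, 95, 110, 150, 85, 70, 60, 105, 120]  # toy data
print(estimate_bandwidth_bill(samples, commit_mbps=100, commit_price=500, overage_per_mbps=5))
```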

 

Also, switching from your own datacenter to colo can be more complex and time-consuming than you might project. Build in extra time, and be prepared for potential snags. If you want to be able to get to your servers physically, it is necessary that the colocation center is nearby (and you can blame the space-time continuum for that one). Also along the lines of access, Adams recommends a walk-through prior to signing a contract to verify that security protocols are solid and any other claims are legit.

 

Finally, to keep your colocated systems practical to use day to day, you want a strong network connection to the colocation facility – as can be achieved with a Border Gateway Protocol (BGP) circuit and BGP tail.

 

*****

 

Are you considering colocation for your infrastructure? At Total Server Solutions, all of our datacenters are robust, reliable, and ready to meet your challenges. Discover our reach.

How to Use Colocation to Your Advantage


 

Let’s look at the colocation market and a few statistics; talk about why businesses are choosing colocation (i.e., the problems it addresses); and finally, review 10 strategies to select the best colocation provider.

 

What does the move off-premises look like?

 

The share of computing workloads handled onsite has hovered at approximately 70% for the last few years, but research suggests cloud and colocation will be responsible for a greater share in the years ahead.

 

According to the Uptime Institute’s 2016 Data Center Industry Survey, fully half of IT decision-makers predict that most of their computing will eventually occur through a third-party facility. Among those, more than two-thirds (70%) say that they expect off-premise to outdo on-premise by 2020.

 

A substantial portion of the transition to external providers is headed for public cloud. However, many businesses will also be switching over to colocation, or colo – the rental of space within an external data center for a business’s own servers and hardware. The practice is called “colocation” because your equipment is located alongside other tenants’ equipment in a shared facility, and the responsibilities are shared as well: you provide the servers and storage, while the vendor provides the facility, physical security, climate control, bandwidth, and power.

 

Colocation vendors have been expanding. That’s evident from statistics from business intelligence firm IBISWorld, which reveal a compound annual growth rate of 14.4% from 2011 to 2016, bringing the market to a total size of $14 billion.

 

Why do businesses choose colocation?

 

Here are some of the most common reasons businesses use colocation, according to senior IT executives:

 

  • Worldwide growth
  • Challenges related to mergers or acquisitions
  • Migration of systems that are not core to the business
  • Leadership directives to move off internal hardware
  • Avoiding the cost of building a new data center
  • Limiting churn from noncritical computing into critical systems
  • Use of a different power grid for disaster recovery
  • Uncertainty about in-house resources or staff.

 

Michael Kassner of TechRepublic lists several other reasons for this practice that get a little more granular:

 

  • Cost-effectiveness – Because data centers can get volume deals on internet access and bandwidth, you can save on those costs.
  • Security – If an organization does not have an IT staff with security expertise, the colocation facility is better positioned to keep data safe.
  • Redundancy – The amount of backup is expanded in terms of both power and the network. A business might have its own generators for uninterrupted power during outages, but it often will not have diversified its internet connections across multiple vendors.
  • Simplicity – You own the software and hardware, so the business is able to update these components as needed without having to renegotiate with the vendor.
  • No more “noisy neighbors” – If you don’t have guaranteed resources in a VPS or cloud hosting plan, you can end up with other tenants hogging the resources (CPU, disk I/O, bandwidth, etc.), hurting your performance.

 

10 tips to select a strong colocation vendor

 

Any company that is using colocation is spending some of its budget on data center capacity from an external party. Since that’s the case, it is entitled to expect that its vendors operate with standards at least as high as those it applies in-house. Brokering services generally has become a more important skill for CIOs; as for colocation, the assessment, contract structuring, and management of these partnerships will become increasingly critical to the success of an IT department.

 

Here are tactics to make sure colocation works for you (you’ll notice that many of them cover similar ground to the general reasons listed above):

 

#1. Prioritize physical location. Yes, you want to be able to get to the facility easily for physical access; plus, be aware that data replication is simpler and network latency lower when the facility is relatively close.

 

#2. Confirm third-party verification. You need to know that availability is fundamental to the infrastructure that you’re using. Make sure there is documentation to back up any claims made by the vendor about their ability to meet Statement on Standards for Attestation Engagements No. 16 (SSAE 16) or other key industry standards. If your systems are mission-critical, get evidence from the provider.

 

#3. Check for redundant connectivity. Note that redundancy is a key reason why colocation is a strong option. You want to make sure of the existence of connectivity backups. Reliability of these internet connections is also crucial.

 

#4. Look for commitments to security & compliance. Security should be a major concern of any data center, but verifying that commitment is a major concern for you. You also have to check that the vendor meets your regulatory requirements so you are protected and aren’t blindsided by violations.

 

#5. Review how the vendor will provide support. You need to make sure your needs are met both in terms of the hardware and support, as should be spelled out in the service-level agreement (SLA).

 

#6. Get a sense of business stability. Matt Stansberry of the Uptime Institute advises looking for a colocation facility that has been running for a number of years, by the same organization, with a consistent group of providers and clients. In other words, you do not want moving pieces but stability. Problems are likelier to arise when the vendor you choose gets acquired by another organization, reinstalls hardware, adjusts its operations, or consolidates equipment. To gauge this aspect of the business, ask about the data center’s hardware lifespan, occupancy rate, and even employee turnover. Does the average staff member have a long tenure? If not, why not? And if the hardware is aging, do not be surprised if the firm is gearing up for potentially problematic upgrades.

 

#7. Assess the scope of services offered. Ideally, the vendor will provide a range of services. It may sound irrelevant to your specific and immediate concerns of getting your equipment colocated. However, diversity of offerings means that you can adjust if your organization’s needs change without having to go through the process of vetting a new provider again.

 

#8. Make sure that cooling and power are guaranteed. The SLA should ensure that power and backup power will be in place without exception.

 

#9. Confirm that operations are aligned with your expectations. You are likeliest to experience downtime when errors or oversights are made in operations. You will not always be able to get full paperwork (maintenance records, incident reports, commissioning reports, etc.), but getting what you can will give you a more transparent window into how things run at the vendor.

 

#10. Generally improve your RFPs and SLAs. Make sure terms are established clearly within an RFP or SLA. Specific ideas from the Uptime Institute Network for enhancing your effectiveness with these documents include: 1.) staying brief (2-3 pages) so that potential vendors don’t feel overwhelmed by a massive document; 2.) remembering that due diligence must occur regardless of what brands are currently using the vendor; and 3.) keeping overprovisioning at bay by questioning hardware faceplate data and challenging assumptions of excessive impact from an equipment refresh.

 

*****

 

Are you looking to make the most of colocation as a strategy for IT at your business? The above considerations can guide you in the right direction. At Total Server Solutions, we meet the parameters of an SSAE-16 Type II audit; but our service is what sets us apart, and it’s our people that make our service great. Download Our Corporate Overview.

Get Started with the Internet of Things


A conscientious plan will help you launch into the internet of things without hitches along the way. Here, we look at three methods or best practices that seem to be held in common by the most successful IoT adopters, as indicated by an MIT overview. First, though, we assess statistics on the scope of the IoT and its general business adoption rate.

 

Is the internet of things on the rise? Well, considering recent IoT market statistics, the answer is a confident “yes”:

 

  • The total market size of the IoT will increase from $900 million to $3.7 billion between 2015 and 2020 (McKinsey).
  • The number of devices that make up the IoT will expand from an installed base of 15.4 billion in 2015 to 30.7 billion by 2020, and on to 75.4 billion by 2025 (IHS).
  • IoT hardware, software, and business service providers will have annual revenues greater than $470 billion by 2020 (Bain).
  • Over the next 15 years, total investment in the industrial IoT will exceed $60 trillion (General Electric).

 

Despite these numbers, and even though the internet of things is generally a subject of widespread attention, many companies have still not launched an IoT project. A report from the MIT Sloan Management Review published just nine months ago revealed that the majority of companies responding to their international survey (3 in 5) did not currently have an IoT project in place.

 

However, as Stephanie Jernigan and Sam Ransbotham note in the journal, the flipside is that 2 out of every 5 organizations are moving forward with IoT. The important thing, then, is to figure out what can be learned from the early adopters.

 

How do you move forward with successful IoT?

 

Here are the three best practices that seem to differentiate the most strongly successful adopters of the internet of things from the ones who didn’t fare as well, according to the researchers:

 

#1 best practice – Think big, but act small.

 

When businesses succeed with their first attempts at the IoT, they don’t get too grandiose with the scale. They select a direction that does not stretch the budget and does not involve an excessive number of devices. A key project mentioned by the researchers is the Array of Things (AoT), a network of sensor-containing boxes currently being installed throughout Chicago to gather and analyze real-time data on the city’s infrastructure, environment, and movement for public and research applications. AoT “will essentially serve as a ‘fitness tracker’ for the city,” notes the project’s FAQ page, “measuring factors that impact livability in Chicago such as climate, air quality and noise.”

 

Reliability is essential because maintenance is a particular challenge of IoT projects such as this. The MIT research team notes that the AoT rollout has moved slowly specifically because the team needs to know exactly how reliable the nodes are. According to the University of Chicago, the first 50 of a total 500 nodes were installed in August and September 2016. The project continues to work in stages through its completion, with all nodes set to be in place by December 2018.

 

There is another side to size with IoT, too. You don’t just have to take care of the devices but also the business relationships that are affected by them. Companies studied by the MIT researchers typically focused on a single group or a small set of people (rather than all of the company’s points of connection), making the project easier to control from a relationship perspective.

 

A benefit of starting small and more niche is that you are less likely to create a headache for yourself in terms of integration moving forward.

 

#2 best practice – Embrace both short-term and long-term vision.

 

Jernigan and Ransbotham advised first coming up with use cases that might be worthwhile for your firm and then calculating the ROI from each of them (a simple back-of-the-envelope ROI calculation is sketched below). To a great extent, you should be able to attach numbers to the project. Executives who replied to the MIT poll said that they had been able to produce specific numbers showing the advantage of IoT via:

 

  • Rise in earnings (23%)
  • Rise in supply chain delivery or accuracy (20%)
  • Drop in fraud or other crime (16%)
  • Rise in harvest or manufacturing yields (15%)

 

The respondents said that these were each reliable ways to gauge effectiveness.
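
To make that “attach numbers to the project” step concrete, here is a purely hypothetical back-of-the-envelope ROI calculation for a candidate IoT use case; every figure is invented for illustration, and a real evaluation would use your own benefit and cost estimates.

```python
# Hypothetical back-of-the-envelope ROI for a candidate IoT use case.
# All figures are made up for illustration only.
def simple_roi(annual_benefit, annual_operating_cost, upfront_cost, years=3):
    """Return ROI over the evaluation period as a fraction of total cost."""
    total_benefit = annual_benefit * years
    total_cost = upfront_cost + annual_operating_cost * years
    return (total_benefit - total_cost) / total_cost

# Example: sensors that reduce manufacturing scrap, evaluated over three years.
roi = simple_roi(annual_benefit=120_000, annual_operating_cost=25_000, upfront_cost=150_000)
print(f"3-year ROI: {roi:.0%}")  # roughly 60% on these invented numbers
```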

 

However, it is not enough to simply think in terms of what’s happening right now. When you move forward with the internet of things, it’s important to think about how the insight from the current project can feed into something more expansive. The MIT scholars note that some enterprises have started out collaborating on the Array of Things before jumping into other ventures.

 

Once you have your own internal project going, you will quickly think of other applications, says Silicon Labs IoT products senior VP Daniel Cooley – how you can put the data from the devices to the best possible use. “[S]omeone puts this wireless technology in place for a reason[,] and then they find different things to do with that data,” he says. “They very quickly become data stars.”

 

#3 best practice – Keep looking at different options.

 

It is key that you are able to see an obvious ROI from your internet of things project, that the data is needed, and that you are gathering it by the best possible means.

 

Nearly two-thirds of those surveyed by MIT (64%) said that they could not get the results that they have achieved with the IoT in any other way. The reason that the Array of Things took form is that the Urban Center for Computation and Data wanted to be able to answer questions about city concerns through data. Realizing that they did not have all the information they needed, they had to think about their options.

 

For instance, the UrbanCCD wanted to analyze asthma rates to see how they related to traffic and congestion levels in certain neighborhoods. Leadership at the organization started to think that sensors, connected to the web and distributed throughout the streets of Chicago, would be the ideal way to get reliable information directly from the source. Jernigan and Ransbotham noted that the scientists at the center did not immediately gravitate toward the IoT. Instead, they had a problem, and setting up IoT sensors was the most reasonable fix.

 

The MIT team highlights a number of other key findings about the internet of things:

 

  • Companies with advanced analytics skills have more than triple the chance of deriving value from the internet of things compared with firms that have less developed skills in that arena.
  • The IoT ties together not just devices but companies as well. This fact “necessitat[es] managerial attention to the resulting relationships,” say Jernigan and Ransbotham, “not just technical attention to the devices themselves.”
  • The IoT ties firms to government agencies and other industry players in addition to their customers and vendors.
  • Generally, a large economy of scale is a good thing. That’s not the case with the internet of things, though: expenses can grow faster than the network of devices.
  • The internet of things rests on sophisticated foundations, including its technical infrastructure and analytics, and it amplifies those complexities.
  • The advantage of the complexity is that those who thrive on contemplating different processes and systems are rewarded.

 

*****

 

Do you want to experiment with the internet of things? Note the emphasis on technical infrastructure as a foundation for an enterprise-grade internet of things project. At Total Server Solutions, our High Performance Cloud Platform uses the fastest hardware, coupled with a far-reaching network. Build your IoT project.

Major Cloud Developments in 2017


Over the past few years, cloud computing has become an increasingly central topic in discussions of top IT trends and concerns. The technology was one of the main points of focus of the Forrester report “2017 Predictions: Dynamics That Will Shape The Future In The Age Of The Customer.”

 

Forrester’s discussion notes that while cloud computing has been an extraordinarily powerful cross-industrial disruptor, it also is no spring chicken. Cloud, now entering its second decade, is “no longer an adjunct technology bolted onto a traditional infrastructure as a place to build a few customer-facing apps,” notes Forrester. Software, platforms, and infrastructure “as-a-service” have been refined and perfected to meet a complete range of business needs, from mission-critical backend programs to customer mobile apps.

 

2017 will usher in further expansion of the cloud. Enterprises already had adopted various clouds prior to 2017, but the commitment to a multi-cloud ecosystem will only increase this year, suggests the report. CIOs will be challenged by management and integration of various clouds between customers, employees, affiliates, and vendors.

 

Since the organization, control, and administration of these technologies will be a growing concern for IT leaders, they will turn to networking, security, and container solutions to better facilitate easy management.

 

Public Cloud Outpaces Previous Forrester Forecast by 23%

 

As noted above, at the heart of the notion that a particular approach or type of system deserves your attention is that it causes disruption. As far as the cloud goes, its disruptiveness will remain unchecked at least through 2020, according to Forrester principal analyst Dave Bartoletti in ZDNet. Bartoletti adds that this will be the first year in which enterprises really start making a seismic shift to the cloud – fueling the market like never before. The result is that the total size of the public cloud market worldwide will hit $146 billion this year, by the industry analyst’s numbers, up from $87 billion in 2015.

 

Forrester does not see cloud plateauing this year either, though. Calling the technology “the biggest disruption in the tech market in the past 15 years,” the business intelligence firm forecast last year that spending on SaaS, IaaS, PaaS, and cloud business services would rise at a 22% compound annual growth rate from 2015 through 2020. The $236 billion market size that Forrester now estimates for 2020 is 23% larger than the figure in an earlier report from the company. Plus, the Forrester researchers now predict a more expansive transition to cloud from legacy software, platforms, and services.
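
As a quick sanity check on those figures, compounding the report’s own 2015 base at the stated growth rate reproduces the 2020 estimate (small differences are rounding in the published numbers):

```python
# Back-of-the-envelope check of the Forrester figures cited above:
# $87B in 2015 growing at a 22% compound annual rate through 2020.
base_2015 = 87.0   # public cloud market size in 2015, in $ billions
cagr = 0.22        # compound annual growth rate
years = 5          # 2015 -> 2020

projection_2020 = base_2015 * (1 + cagr) ** years
print(f"Projected 2020 market: ${projection_2020:.0f}B")  # ~ $235B, in line with the $236B cited
```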

 

3 Reasons Business Can’t Ignore Cloud

 

To demonstrate how critical cloud computing has become, Bartoletti offers three reasons business can’t ignore the technology:

 

  • Pay-per-use is just one way to buy cloud. 2017 will see a greater focus in the industry on shaping more diverse payment models: pre-paid, on-demand, reserved capacity, and enterprise agreements.
  • More lifting and shifting. Cloud migration will become simpler with better lift and shift functionality. With massive infrastructure-as-a-service providers offering a lift and shift approach to transfer systems into their environment, this method is becoming more prevalent.
  • Challenges remain with hybrid cloud networking. Hybrid cloud is now deployed heavily throughout industry. However, says Bartoletti, “it will take a long time for most organizations’ networks to be able to seamlessly connect into hybrid cloud management and orchestration tools.”

 

In this climate, it’s necessary to establish clear plans related to software-as-a-service and private cloud. Furthermore (and as indicated above), the management capacities for security, networking, and containers, as well as hyper-converged infrastructure, are becoming important areas of expertise for CIOs in 2017.

 

7 More Central Cloud Developments

 

The advice Bartoletti gives reflects what he seems to consider the most salient aspects of a broader list of major cloud trends compiled by Forrester, “Predictions 2017: Customer-Obsessed Enterprises Launch Cloud’s Second Decade.” In an overview of the report, Forrester enterprise architecture analyst Charlie Dai hits some of the same points covered elsewhere, offering a couple of to-do items for 2017 before outlining the developments:

 

  • Solidify your 2017 private cloud and software-as-a-service plan (and extend it through 2018).
  • Get informed about how your technology options are changing with innovations in containers, networking, security, and hyper-converged architecture.

 

Here is the remainder of the 10 cloud developments for 2017:

 

  • Private clouds will become more prevalent as hyper-converged infrastructure is adopted more aggressively.
  • The so-called “megaclouds” (SAP, Salesforce, Google, AWS, Microsoft, IBM, etc.) will become less dominant as growing competition and differentiation based on service and niche opens up the market.
  • Expensive, heavy, and complicated private cloud suites will become less popular.
  • Software-as-a-service will become more customized and distinct in the needs that it meets.
  • Cloud development worldwide will hinge heavily on Chinese companies.
  • Cloud management and platforms will be heavily impacted by greater use of containers.
  • Security will become a more standard tie-in with cloud service offerings.

 

Additional Forrester 2017 Predictions

 

Let’s branch out to a broader context and look at the company the cloud keeps: what other technologies were highlighted by Forrester in its 2017 predictions? The other areas of tech that Forrester covers in “2017 Predictions: Dynamics That Will Shape The Future In The Age Of The Customer” (with the cloud subsection covered above) are artificial intelligence (AI); the internet of things (IoT, which is heavily dependent on cloud); and virtual and augmented reality:

 

  • Artificial intelligence: The report points out that users have provided an extraordinary amount of data about themselves to companies. “From a customer’s point of view,” says Forrester, “that is OK (sort of) if the company uses that same data to deliver valuable, personalized experiences.” However, data is locked in numerous environments, and companies are generally not integrating with one another for more complex insights. This year, AI integration will soar, brought about by a stronger demand for knowledge about user behavior via mobile, IoT, and wearables.
  • The Internet of Things: The internet of things is growing astronomically as companies have begun to realize its potential to drive revenue. However, the manner in which IoT is being applied is unstructured – since the industry is still taking shape. Use cases, protocols, standards, programs, and equipment are highly varied. During 2017 and leading into 2018, the IoT is becoming increasingly sophisticated. Modern microservices will underlie internet of things plans, which will extend across cloud servers, gateways, and edge devices. IoT devices are security concerns, though: they can be hacked and turned into DDoS slaves.
  • Virtual and augmented reality: Following the ascent of Pokémon Go, Forrester is doubling down with its predictions for the continuing rise of virtual and augmented reality. The analyst lists three key insights related to VR and AR: 1.) IT costs and power will keep getting more manageable; 2.) Developers will keep working with an array of tools, leading to innovative approaches; and 3.) Since there aren’t “best” applications or use cases in this field yet, the market will experience a gradual and systemic evolution that is carefully strategized.

 

*****

 

Do you want to be better prepared for the Age of the Customer? As with the other technologies highlighted by Forrester, you don’t just need a cloud platform. You need the fastest, most robust cloud platform in the industry. Your cloud starts here.

Jazzercise Rebrand with Magento


Jazzercise needed to change its image and upgrade its online presence. The company rebranded in 2015 and launched a new Magento site in 2016 – resulting in a 20% reduction in fixed operating costs and 14% higher revenue.

 

Jazzercise. Yes, it’s a household-name exercise program, but it certainly has not been the workout of choice for millennials. That’s in part because the brand has struggled to recover from an 80s image – when the fitness world saw a heyday with the rise of stars such as Jane Fonda and Richard Simmons. Probably no one is more emblematic of that erstwhile “get in shape” craze than Simmons – whose 1988 sensation Sweatin’ to the Oldies made him virtually synonymous with slimming down in a positive, self-affirming, and entertaining way.

 

This association with Simmons is problematic because he has been treated rather mercilessly in the media. The jokes of talk show hosts such as Howard Stern and David Letterman were sometimes harmless and other times cruel, as was also true of random amateurs online. The manner in which Simmons was framed as a laughingstock is troubling, given how emotionally fragile he seems to be – and the fact that he has receded from TV since 2014. What does all this mean from a branding perspective? Jazzercise identified that it was stuck in the past, as part of that same corny fitness trend that made it so easy for pop cultural figures to ridicule Simmons.

 

Related: “The Right Ecommerce / Brick & Mortar Balance”

 

Jazzercise wanted to be taken seriously, and in order to do so, the brand had to come of age. Let’s look at how Jazzercise updated its brand image and how the company’s adoption of Magento helped it to recover momentum.

 

Jazzercise Rebrands

 

Jazzercise has actually been around for 48 years; now headquartered in Carlsbad, California, the company was originally launched by Judi Sheppard Missett in Evanston, Illinois. In 2015, Jazzercise began its reformulation, rebooting its logo and color palette while introducing a new ad campaign; in 2016, the brand switched to Magento for better e-commerce presentation. Formerly considered a softer, subtler exercise program, the company’s new approach incorporates movements inspired by hip-hop dance, Pilates, and even kickboxing. If this sounds like an overhaul, it is; the slogan of the campaign was actually, “You Think You Know Us But You Don’t.”

 

Group fitness classes were already a part of American culture before the popular tidal wave of exuberant hip-swiveling that ushered in the 90s. “In the ’80s is when we saw [fitness instruction] really take off, and Jazzercise was a very big part of that,” explains American Council on Exercise (ACE) senior advisor Jessica Matthews. “That’s when you started to have this identified profession.”

 

The Big Business of Dance-Inspired Workouts

 

Let’s get something straight so that it’s clear Jazzercise is not a sinking has-been: the company is valued at $100 million, and it’s currently #81 on Entrepreneur’s list of the 500 fastest-growing franchises. True, 2014 saw a decline in the number of Jazzercise locations; but the dip was effectively corrected with the 2015 rebrand and the 2016 move to Magento.

 

[Chart: Jazzercise franchise units (source: https://www.entrepreneur.com/franchises/jazzerciseinc/282474)]

 

The brand credits much of its success to something that any marketer or salesperson can appreciate: framing. This particular take on fitness lets people see exercise differently from anything that was on the market previously. Jazzercise positions exercise as dance, and people don’t think of dance as exercise. It allows people to do something they enjoy, rather than having to push themselves through something miserable. That may not convince you to sign up for a membership, but it does help explain the popularity of the model and the essence of the company’s differentiation.

 

While dance is fun, the fitness firm has increasingly recognized the need to update both the dance moves and the songs in order to keep customers engaged. Jazzercise credits that flexibility – the updating and continual reformulation of what people experience in its classes – for its rather incredible retention: the average customer stays for seven years. For initial attraction, however, Jazzercise now also focuses centrally on effectiveness. According to the brand, you can burn 500-600 calories in a one-hour session; and independent assessments of dancing’s impact on calories suggest that’s possible (although it might be closer to 400 calories for the average person).

 

A New E-Commerce Platform as a Springboard for Further Growth

 

Jazzercise isn’t just an in-person entity, of course. Yes, the physical franchise model is at its core; but today, Missett (still the firm’s CEO) and her team release new branded exercise clothing and accessories via the company’s e-commerce platform each month.

 

Up until 2015, Jazzercise was with Amazon Webstore, and when that segment of the tech giant was shut down, Jazzercise had to rethink its approach. One of the things that had frustrated the company about the Amazon system was that they couldn’t integrate their enterprise resource planning (ERP) system with the platform’s API. That issue caused “syncing delays that dominoed into inventory discrepancies, and fulfillment and accounting nightmares,” according to a Magento case study on Jazzercise.

 

[Chart: Mobile use overtakes desktop (source: http://bgr.com/2016/11/02/internet-usage-desktop-vs-mobile/)]

 

The company was also concerned about mobile device support – since mobile use is now greater than desktop use globally, and since smartphones and tablets are increasingly used by e-commerce shoppers. Because the Amazon platform did not support responsive templates, the company had to manage separate desktop and mobile sites.

 

Jazzercise needed more thorough and agile merchandising options. The capacity of Webstore in this category was not meeting the fitness brand’s expectations, hurting revenue.

 

Jeff Uyemura, the Jazzercise digital manager, specifically points to the issue of personalization and how Magento has allowed the company to customize its approach for each user. He said the decision was made to switch because the technology “allowed us to target key customer segments more effectively and offer unique content and price points.”

 

Impact of Magento on Flexibility, Costs & Revenue

 

As indicated above, when Jazzercise switched over to Magento, they were able to integrate it with their ERP platform so that there was no longer a delay in syncing data. That near real-time processing allows inventory to be consistent throughout the ecosystem and prevents overselling.
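
To illustrate the integration pattern (not Jazzercise’s or Magento’s actual implementation), the loop below sketches the kind of ERP-to-storefront stock sync such an integration performs. The endpoint URLs, field names, and token are placeholders, not real API calls.

```python
# Hypothetical sketch of an ERP -> storefront inventory sync.
# Endpoint URLs, field names, and credentials are illustrative placeholders only.
import requests

ERP_EXPORT_URL = "https://erp.example.com/api/stock"           # placeholder
STORE_STOCK_URL = "https://store.example.com/api/stock/{sku}"  # placeholder
API_TOKEN = "replace-with-a-real-token"

def sync_inventory():
    # 1. Pull current stock levels from the ERP system.
    erp_stock = requests.get(ERP_EXPORT_URL, timeout=30).json()  # e.g. [{"sku": "TANK-01", "qty": 42}, ...]

    # 2. Push each level to the e-commerce platform so the storefront never
    #    advertises quantities the warehouse no longer has.
    headers = {"Authorization": f"Bearer {API_TOKEN}"}
    for item in erp_stock:
        response = requests.put(
            STORE_STOCK_URL.format(sku=item["sku"]),
            json={"qty": item["qty"], "in_stock": item["qty"] > 0},
            headers=headers,
            timeout=30,
        )
        response.raise_for_status()

if __name__ == "__main__":
    sync_inventory()  # in production this would run on a schedule or react to ERP events
```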

 

It was also possible within Magento for Jazzercise to upload a custom mobile theme and maintain just one site that renders correctly on any type of device (rather than the four separate sites – B2B and B2C, each with mobile and desktop versions – it had when the transfer was made). With this simplicity, Magento has allowed the brand to lower its maintenance costs and create more seamless digital brand consistency.

 

In terms of merchandising, the e-commerce platform has allowed Jazzercise to highlight certain items in each category and showcase them when a catalog launch occurs. Plus, creating a design that is personalized to the customer provides better targeting.

 

The brand’s Magento site went live in June 2016. Uyemura credits it with bringing e-commerce fixed operating costs down 20% and boosting online revenue 14%.

 

*****

 

Would you like to see a similar reduction in e-commerce costs and improvement in revenue? The Magento platform can only deliver the speed and reliability you need to impress prospects and customers if it’s backed by the right infrastructure. At Total Server Solutions, we offer high-performance Magento hosting, along with optional merchant accounts so you can sell and accept payments quickly and easily.

Ecommerce and Brick-and-Mortar


We know that the average shopper has needs that are met in-person, as well as ones that are met through digital channels. How can companies balance their efforts between online and offline for the best possible results?

 

Stats that prove the transition to online shopping

 

When we think of a store, the first thing that might come to mind is a physical one. We walk through the door and can browse through the aisles, picking products up and trying things on before making our decisions. Virtual reality may offer “full immersion,” but the real “full immersion” is reality itself: the product in your hand.

 

However, that value of in-person inspection comes at the cost of relative convenience, as more options have emerged online and people have grown increasingly comfortable shopping for and paying for items through their computers and mobile devices. As the landscape changes, the position of storefront retail is shifting; and yes, it is clearly in decline. Fourth-quarter industry statistics from Investor’s Business Daily show poor numbers for all the retail groups the publication monitors; in fact, the Department Stores category is last out of all industries – ranked 197 out of 197. The good news is that this devastation in the world of B&M is aligned with an expansion of e-commerce sales – a 29% overall rise during the 2016 holidays.

 

Another way to see this trend is by comparing the Q4 2016 results to determine the “online growth edge” for a couple of major big-box stores (IBD):

 

Brand   | E-commerce | Storefront | Online growth edge
Target  | +34%       | -1.5%      | +33%
Walmart | +29%       | +1.8%      | +27%

 

As e-commerce continues to become more sophisticated and better able to address consumer expectations, what is the value of a physical storefront? We know it’s not the heavy-hitter it once was. It’s not just clear from the above megabrands that are straddling the fence but from those that have gone bankrupt or are closing their stores nationwide, such as American Apparel, The Limited, Wet Seal, Aeropostale and Pacific Sunwear.

 

Is B&M sinking into oblivion?

 

One perspective holds that the physical store can be viewed as similar to snail mail: useful to many now, but destined to become increasingly irrelevant. That’s not quite right, though. In essence, the rise of digital does not signal the demise of brick-and-mortar so much as an evolution of the way that people shop and a shift in the role of stores toward a more functional, mundane purpose as distribution points.

 

Boston Retail Partners principal Ken Morris uses the example of Restoration Hardware to make this case. The showrooms of the brand are settings for inspiration, notes Morris. “[T]hey’re not really selling anything there,” he says. “It’s like a giant 3D real-time catalog.”

 

BRP’s vice president and practice lead, Perry Kramer, adds that the service experience needs to be treated as paramount in order to win at storefront retail in the new age. The example he gives is the Apple Store, where you can try products and get advice from salespeople who are generally considered well-trained and helpful.

 

How omnichannel goes beyond multichannel as an integrator

 

You may have heard the word omnichannel a bunch of times and perceive it as one of those annoying marketing buzzwords; but actually, omnichannel is an important business concept.

 

You can think of omnichannel as a type of multichannel or even the newer, savvier evolution of multichannel. TechTarget defines omnichannel as “[a] multichannel approach to sales that seeks to provide the customer with a seamless shopping experience, whether the customer is shopping online from a desktop or mobile device, by telephone or in a bricks and mortar store.”

 

What differentiates omnichannel from multichannel? In a nutshell, it’s integration. Omnichannel involves backend integration rather than just diversification of channels. Compare the above description of omnichannel to a definition of multichannel provided by Jay Acunzo in the HubSpot Blog. Acunzo defines the simpler multichannel concept as communication across various channels, both digital and otherwise. Multichannel is about marketing in many different places at the same time; omnichannel is about bringing together the insight from each approach.
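
A toy illustration of that backend unification (not any vendor’s actual data model): events from separate channels are folded into a single customer profile, which is the integration step that plain multichannel skips.

```python
# Toy sketch: unify per-channel events into one customer profile.
# Data and field names are invented for illustration.
from collections import defaultdict

events = [
    {"customer_id": "C100", "channel": "web",    "action": "viewed",    "sku": "SHOE-7"},
    {"customer_id": "C100", "channel": "mobile", "action": "carted",    "sku": "SHOE-7"},
    {"customer_id": "C100", "channel": "store",  "action": "purchased", "sku": "SHOE-7"},
    {"customer_id": "C200", "channel": "web",    "action": "viewed",    "sku": "HAT-3"},
]

profiles = defaultdict(lambda: {"channels": set(), "history": []})
for event in events:
    profile = profiles[event["customer_id"]]
    profile["channels"].add(event["channel"])
    profile["history"].append((event["channel"], event["action"], event["sku"]))

# One view of customer C100 across web, mobile, and the physical store.
print(profiles["C100"])
```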

 

There is another aspect of omnichannel that is evident in its name. While multichannel is about the many avenues you can take (see its prefix multi-), omnichannel is about addressing every possible channel (see omni-). Omnichannel is a more thorough approach based on the idea that people now expect to be able to shop, experience your brand, and engage with you as a customer through a full range of possible means (for example, across the various social media sites, brick-and-mortar stores, your websites, and your mobile apps).

 

To better understand how an omnichannel strategy can be leveraged by a brand, just look at what customers think should be available to them. Nearly three-quarters of shoppers (71%) expect brands to have in-store inventory data available online. Similarly, half of customers (50%) expect to be able to buy on the Internet and pick up items in person (“Customer Desires vs. Retailer Capabilities: Minding the Omni-Channel Commerce Gap,” Aberdeen).

 

The final, fundamental reason why omnichannel is such a key concept for your company’s growth is that these consumers are big spenders. “Omnichannel shoppers are typically a retailer’s most valuable customers—spending over five times as much as those who only shop online,” notes a Bain & Company report. “Creating a great experience for those customers is critical, and not doing so is very risky.”

 

Forgetting the money and simply looking at omnichannel in terms of user experience, your users should be able to shop more efficiently and without having to stop and start along the way. Customer service should be as sophisticated as possible; and brands often neglect that concern, so integration of different touchpoints via omnichannel is a powerful differentiator.

 

3 brands with omnichannel to emulate

 

Here are three household-name brands with omnichannel user experiences that are noteworthy and worthy of mimicry:

 

  1. Disney – This incredibly popular family brand has embraced omnichannel with its My Disney Experience tool, which allows consumers to comprehensively plan their trips, from getting Fast Passes to the park to pre-determining dining locations. Within any Disney park, you can find attractions and wait times via the mobile app. The Magic Band program, which offers Fast Pass integration, adds further capabilities and complexities: hotel room key functionality, storage of photos with Disney characters, and food ordering.
  2. Bank of America – This brand is considered a bellwether in finance related to omnichannel. The company’s tools include the ability to deposit checks and schedule appointments both via mobile and on desktop. Additionally, customers are able to pay their monthly bills seamlessly through any device.
  3. REI – This company provides clear product data throughout its customer ecosystem, says Aaron Agius in the HubSpot Blog. “[T]hat kind of internal communication will keep customers happy, satisfied and returning back to their store again and again,” he adds.

 

*****

 

Hopefully, the above advice can help you address the need for balance between online and offline shopping at your company. Do you need help getting your e-commerce site up and running, or improving the performance of your current site? At Total Server Solutions, we support all of the top shopping cart applications and also offer merchant accounts so you can sell and accept payments quickly and easily. See our secure e-commerce solutions.

WordPress Lovers


  • The SEO-friendliest CMS
  • Expandable open source
  • Flexibility / customization potential
  • Time-saving
  • Peace of mind
  • Lower cost upfront and ongoing
  • Responsiveness
  • Scalability
  • Simplicity

 

As indicated in our companion piece “Calling All WordPress Haters”, 18.3 million sites now run on WordPress. That footprint accounts for 59% of the CMS market, more than 8 times the share of the second-place content management system, Joomla (6.9%).

 

The sheer volume of people using this technology probably does not alone convince you that it’s the way to go. What are a few of the most compelling arguments to use WordPress? Here are nine of those mentioned by coders, web design companies, and others who are highly familiar with the environment:

 

The SEO-friendliest CMS

 

One of the surest ways to get more visitors to your site is to improve your search rankings. Here are a few ways WordPress is standardly geared toward strong SEO that are mentioned by Nick Schäferhoff in Torque Magazine:

 

  • Although the primary language of WordPress is PHP, the software creates HTML pages that search engines can scan effortlessly.
  • You can set permalinks that include the keywords of the article or other page. The permalink will auto-generate with the title, but it can often be shortened and refined for better keyword focus.
  • The title is a key component of how your page will be understood by Google. The combination of the title and the heading tags also provides the search spider with a sense of structure.
  • Content marketing is critical to getting noticed online. Since WordPress was originally conceived as a blogging platform, it is considered a powerful tool for text or multimedia posts.
  • The CMS makes it simple to bring in images and to optimize them with descriptions, ALT tags, and other elements to further increase your search prominence.

 

All those aspects of WordPress are included in the basic installation, before you add any plugins or tweak your code.

 

Expandable open source

 

WordPress is free, open source software, and you can host it yourself – which means that you won’t have to pay anything to download or maintain it (i.e., you have access to updated releases).  The ecosystem of plugins for the CMS is vast, including more than 20,000 options (caching plugins, contact forms, social buttons, automated “related post” integration, etc.).

 

Flexibility / customization potential

 

A strength of WordPress is that it provides a cookie-cutter structure, but that model is by no means rigid. Anyone using it can easily change the basic design and functionality by switching out the themes or plugins. Elisha Terada of web design company Fresh Consulting notes that these fundamental aspects of how your site operates and is presented are not just developed by enterprises but are user-created (which doesn’t hurt when it comes to seeing things from your perspective).

 

Mikke of Mikke Goes Coding also lists customization as a key WordPress strength. He notes that the appearance of the site and the way that it is organized for the user are just as central to your success as what the site provides from a utilitarian view. Customization within the CMS is nearly open-ended if you want to work with a developer on rebuilding elements. If you want to spare that expense, there are a vast number of plugins and themes you can use to improve the visuals of your site, what’s available to users, and how it works for you.

 

You can change how the features and areas of your pages are laid out, the ways the site can be navigated, and (of course) the content. You will be able to make adjustments to the background, visuals, fonts, and colors.

 

For broad-sweeping changes, you can switch out the theme that you are using almost instantly. There are many thousands available, and a good portion of those are free. Businesses often decide it makes sense to use paid themes, though, since the developers are then better incentivized to provide bug fixes and security updates; support for the theme can then sometimes be included.

 

To add functionality to the site, you can use plugins. Example functionalities include pointing readers to related blog posts, creating a contact form pop-up, building in analytics, creating newsletter lists, adding other languages (such as Spanish), blocking spam comments, and improving SEO.

 

Time-saving

 

Time is at a premium in 2017. You want the tools that you use for your site to help you streamline and operate efficiently. Learning how to work with WordPress is much faster than building a site from the ground up using HTML, CSS, and JavaScript.

 

Time is one of the reasons Mikke gives for why he uses WordPress, saying that the framework allows you to accelerate earlier. “[I]nstead of starting with small steps and the very basics of programming a website,” he says, “you can take jumping leaps with WordPress and be able to launch your web page surprisingly quickly.”

 

Peace of mind

 

You often hear about WordPress hacks. Like in any environment, though, a lot of the challenge is to be conscientious: use a complex password, keep up with the latest updates, and consider using a security plugin. WPBeginner suggests Sucuri, for example.

Lower cost upfront and ongoing

All of these open source options are “free,” in a sense – but you still may want to pay an outside party to get your installation in place, optimize it, and regularly update it. When DeviousMedia compared the top three CMS options by assessing the costs of setting up, customizing, and maintaining a typical site, WordPress came out cheaper than both Joomla and Drupal. Because WordPress is so widely used and there is such a large professional community surrounding it, it’s easy to find someone to provide development or design services as needed.

 

Responsiveness

 

When you use WordPress, your site will be responsive, so it will be user-friendly on any device – desktop or mobile. This is highly important, since mobile is becoming more central to the web all the time. Worldwide, mobile traffic increased 63 percent in 2016, and the data flowing through mobile networks is 18 times greater than it was just five years ago!

 

Scalability

 

There is sometimes a misconception that WordPress is a starter kit for the Internet. That’s not the case whatsoever. In fact, the system is used by news websites and other organizations that depend on driving huge amounts of traffic to their information. Mikke notes that CNN, MTV News, Fortune, TechCrunch, and Sony Music are all WP devotees.

Simplicity

We have mentioned that it is fast to get WordPress up and running. Not only is it quick, but there is not a huge barrier to entry – as indicated by WPBeginner. The CMS’s community continues to expand in part because ease-of-use is a fundamental principle.

 

The open source community is thriving. If you want additional help, you can get paid WordPress support. In terms of the actual hardware that powers your site, a managed service provider can help you with all the technical aspects so that you can focus exclusively on the front end.

 

Terada concurs with WPBeginner on this point, referencing the democratization of technology allowed by this system. A primary reason that WordPress is so approachable is that it has the usability of a word-processing program, which is because it was initially a blogging platform. It was built for people who didn’t necessarily have any tech background and just wanted to put their ideas up online.

 

“[M]ost… user-interface components are user-friendly,” says Terada, “and there are written and recorded manuals available for you to easily learn how to use WordPress.”

 

*****

 

The truth is, not everyone is quite so enthusiastic about WordPress as you might think from the above discussion. Want to hear the other side on WordPress? See “Calling All WordPress Haters.”

 

On the other hand, are you now convinced that WordPress is the way you want to go? While this CMS is an extraordinary tool, it’s key that you have a high-performance infrastructure for better user experience and SEO. As with a premium theme, you may value consistent support and expert management for your server. See our testimonials.

WordPress Haters


You know you need a website, and everything you hear is about how great WordPress is – how simply and seamlessly you can create a site using the platform. It’s clear from the number of sites using it – 18.3 million at last count – that it is an immensely popular brand. But the statistic that is the most staggering is WP’s market share. Here is market share for the top 5 CMS systems (W3Techs; May 29, 2017):

 

  1. WordPress — 59%
  2. Joomla — 6.9%
  3. Drupal — 4.7%
  4. Magento — 2.5%
  5. Blogger — 2.2%.

 

In other words, WordPress has more than 3.6 times as many users as its four biggest competitors combined. We all want the best tools, not just the one that first comes to mind, so the question must be asked: Is WordPress as great as everyone seems to think it is? Or is it just the lazy, safe choice?

 

Let’s look at that second scenario, exploring the perspective of those who toss it aside. We get a good example from people who are completely familiar with the ecosystem and still choose to go another direction – such as web developer Ben Gillbanks.

 

Related: Calling All WordPress Lovers

 

When WordPress Theme Providers Forgo WordPress

 

The thing that’s interesting about Gillbanks specifically is that he co-owns a WordPress theme site called Pro Theme Design. The route he and his partner, Darren, took away from WordPress hinged on the distinction between wordpress.com and self-hosted installations of the CMS.

 

The site ran on WordPress until 2014, when Ben and Darren swapped their multipage structure for a static, non-WordPress, single-page site showcasing the company’s themes available on wordpress.com. Their top priorities at that point were manageability and building up the brand around wordpress.com – so the company didn’t even offer self-hosted options.

 

Darren found the move to a site unrelated to WordPress liberating – so they kept the static site even when the company began reintroducing self-hosted themes to its catalog.

 

The specific technology behind the site is FlightPHP, a PHP microframework. Data is stored in text files rather than a database, there are no dynamic elements, and third-party services handle the analytics and contact forms.

 

What’s Wrong with WordPress?

 

You’ve surely seen plenty of arguments for why moving to WordPress is a great idea. Let’s look at the top reasons to ditch WP in favor of an alternative:

 

  1. “It’s slow to respond.” – Many people actually choose WordPress because it is considered relatively fast, assuming you make a number of tweaks focused on acceleration. However, speed was one of the three main factors Smash Company’s Lawrence Krubner listed when he decided to transition away from WordPress in 2017.
  2. “It’s a contained universe.” – It can be a good idea for people who are currently using WordPress to try something different simply for variety and building a new skillset. Gillbanks noted that this was a core concern for him since he felt he was stagnating as a developer when he was trapped inside the world of WordPress.

 

This reason for dropping WordPress flips something that current users often tell themselves: that staying with it is the efficient, easy choice because it’s what they already know. Ben instead embraces the road less traveled, since the very act of changing his approach helps him become nimbler and more capable.

 

“Doing something even a little bit different is good for the mind,” he said. “By working with a PHP Framework that I haven’t used before, by ditching databases, by integrating with third party services, I can learn.”

 

  3. “It’s a frequent hacking target.” – Another primary factor listed by Krubner was poor security: he said his site had been hacked twice.

 

When Sucuri analyzed more than 11,000 sites that were infected with malware or being used in phishing scams, they found that fully three-quarters were WordPress sites – and half of those WordPress installs were outdated.

 

Clearly, security is a broad and growing problem. As of March 2016, Google reported that 50 million Internet users had seen warnings that a site may contain malware or that their information might otherwise be compromised – up from 17 million in March 2015, nearly tripling in a year. Phishing gets roughly 50,000 sites a week blacklisted by search engines, and malware sidelines another 20,000.

 

Sucuri emphasizes that the data on phishing and malware only reveals the number of sites for which security issues have immediate and obvious consequences. Additional sites are unknowingly jeopardized, and their authority downgraded, after falling victim to infections such as SEO spam.
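A practical takeaway from the “outdated” half of that statistic: it is easy to check what version of WordPress a site is advertising to the world. Below is a minimal TypeScript sketch, assuming Node 18+ (for the built-in fetch) and assuming the site still emits WordPress’s default generator meta tag – many themes and security plugins strip it, so a miss proves nothing. The URL is a placeholder.

```typescript
// Rough check of the WordPress version a site advertises via the default
// <meta name="generator" content="WordPress x.y.z"> tag.
// Assumptions: Node 18+ (global fetch); the generator tag has not been removed.
async function reportedWordPressVersion(url: string): Promise<string | null> {
  const response = await fetch(url);
  const html = await response.text();
  const match = html.match(/name="generator"\s+content="WordPress ([\d.]+)"/i);
  return match ? match[1] : null;
}

// Placeholder URL -- point this at a site you operate.
reportedWordPressVersion("https://example.com").then((version) => {
  console.log(
    version
      ? `Site reports WordPress ${version}; compare it against the current release.`
      : "No generator tag found (hidden by the theme or plugins, or not WordPress)."
  );
});
```

If the reported version lags the current release, updating the core, themes, and plugins is the cheapest way to stay out of statistics like Sucuri’s.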

 

  4. “It’s weak and bloated.”

 

A WordPress site is usually not just WordPress but a combination of the core CMS platform and various plugins from outside parties. Incorporating numerous plugins can make a site more user-friendly, but it also eats through server resources. If your site is bogged down with a bunch of plugins, your search visibility will suffer, and users will be likelier to leave because of slow loading.

 

Outside WP’s plugin ecosystem, there are simply fewer moving parts to break. Going another route that still relies on external services (as Gillbanks’s site does) requires careful vetting, though: always make sure that any outside services are well built and stable, and have backup plans in case issues arise.

 

  5. “It ain’t the only open source in town.”

 

WordPress has succeeded to a great degree because it is open source – which means that its code is constantly being improved by its savvier, more technically adept users. Well, any site built on open technologies can share the code that makes it come to life with the community – as Ben did by publishing his site’s code on GitHub.

 

People can study the site’s code for new ideas, and they can also submit pull requests and make note of problems.

 

  6. “MySQL sucks.”

 

MySQL is one of the core technologies behind WordPress, and its incorporation is one of Krubner’s biggest beefs with the CMS. Who else says, in so many words, that “MySQL sucks”?

 

  • In a piece entitled simply, “Avoid MySQL,” programmer Elnur Abdurrakhimov notes that the open source relational database management system (RDBMS) is unsafe and doesn’t functionally outdo the alternatives. Elnur switched away from MySQL to PostgreSQL after discovering a bug that was not being resolved. “It’s not really important what the bug is,” he said. “It’s the mentality of MySQL developers to do buggy s— they can’t fix and then call them features.”
  • In a thorough piece on MySQL’s numerous failings, the author of grimoire.ca covers problems he has experienced with storage and data processing, central flaws in the database’s design, and what he considers weak arguments for why it’s the right choice.
  7. “It’s slow for development.”

 

Everyone thinks that WordPress is the fast and easy way to get a website going. It’s accepted almost religiously that it is a faster development tool than just about any other. Interestingly, though, Ben says that he has found he can code faster with his new, non-WordPress setup.

 

*****

Understandably, you may want to stick with WordPress because it’s your comfort zone, or for upsides not covered here. But clearly, there are some good reasons to consider other options. Do you need hosting and expertise for your project transitioning off WordPress? At Total Server Solutions, we’re different. Here’s why.

Hajime Versus Mirai

Posted by & filed under List Posts.

A malware strain called Mirai emerges, amassing a botnet by exploiting unsecured Internet of Things devices. As the number of zombie devices builds, the people behind the malware start using it in distributed denial of service (DDoS) attacks. Eventually, Mirai puts itself on the map by launching an attack on security researcher Brian Krebs measuring an incredible 665 gigabits of traffic per second. Mirai’s author open-sources its code on a hacker forum. Krebs identifies (well, suspects, with extensive evidence) Rutgers University student and DDoS protection firm owner Paras Jha as the malware’s creator.

 

Fast-forward to today: That piece by Krebs (linked above) made a lot of headlines, and Jha was questioned by the FBI; but Mirai didn’t go away. If anything, what looked to some like an epic good-versus-evil showdown between Krebs and Mirai was actually a small skirmish in a lengthy, developing war. Krebs wanted to unmask the person he believed responsible for the spread of the botnet, but its code had already been made publicly available. What could be done about Mirai itself? Who could step up to save the rest of the Web from the unprotected segment of the Internet of Things? Someone must have thought the best bet was to force-secure vulnerable devices – and decided they would be the one to make it happen.

 

Is Hajime Mirai’s Archnemesis?

 

One would expect competition among black hat hackers to create the most dominant IoT malware, each wanting as many devices as possible to use as a more effective digital weapon. What you might not have considered is someone going up against the malware with the completely opposite agenda – using the same tactic of injecting code, but for protective purposes. Yet that is exactly what has happened, and the general consensus in the security industry is that a white hat hacker is responsible for the Hajime IoT botnet.

 

In fact, after Dan Goodin of Ars Technica noted that it took a great amount of computing knowledge to design and deploy the white hat network, he concluded that it “just may be the Internet’s most advanced IoT botnet.”

 

Hajime is designed to parallel Mirai in certain ways – it uses the same username and password combination list, for instance. The malware infects the IoT device and then blocks the four ports most widely used for infection. Additionally, it presents a cryptographically signed message on the terminal of the infected device that says the author is “just a white hat, securing some systems.”

 

Since the goals of Mirai and Hajime are directly opposed (to enslave and to protect the devices), Tom Spring of Kaspersky Labs’ Threatpost believes that the Hajime vigilante white hat and Mirai black hats will be locked in an ongoing head-to-head rivalry for control of routers, DVRs, CCTV cameras, thermostats, etc.

 

It’s unclear at this point who the author of Hajime is. It was first detected by Boulder-based Internet service provider Rapidity Networks in October 2016. Since then, it has grown at a breakneck pace, infecting any IoT devices that use default passwords and have open Telnet ports (i.e., the same targets as Mirai).

 

Hajime and Mirai use essentially the same means – mass self-propagation and infection of the IoT – to achieve very different objectives. Although Mirai is made up of a huge number of devices (estimated at 493,000 in October 2016), it functions as a unified tool that allows cybercriminals to hammer targets. Hajime, on the other hand, does not appear to have a purposeful dark side (although intention isn’t everything – see below). Instead, it seems the only reason it was created is to self-propagate and to seal off unsecured Telnet ports so that they aren’t taken hostage by Mirai and used to do the bidding of malicious actors, at the expense of whatever victims they choose.

 

Symantec analysts have placed the number of Hajime-infected home routers, webcams, and other devices at 10,000. However, Rapidity Networks had previously estimated a much wider spread of 130,000–185,000 devices.

 

Hajime: The Full-Featured IoT Botnet

 

While Mirai has stripped-down functionality, Hajime has a much more sophisticated feature set. One of the best examples is how it tries username-password pairs. Mirai simply runs through a list of common combinations; Hajime, by contrast, parses the information on the login screen to determine which manufacturer built the device and then tries that manufacturer’s default logins. For example, Hajime attacks a MikroTik router with the username “admin” and no password – the combination MikroTik’s documentation lists as the factory default. By minimizing incorrect password submissions, Hajime is less likely to get blacklisted or locked out of the device.
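To make that contrast concrete, here is a loose TypeScript sketch of the banner-first idea – not Hajime’s actual code, just an illustration of reading whatever a device prints on connect and matching it against a table of known factory defaults. The MikroTik admin/blank-password pair comes from the paragraph above; the other table entry, the address, and the timeout are assumptions. Framed defensively, an audit tool can stop at the warning rather than ever attempting a login.

```typescript
// Illustration of banner-based device identification (the approach attributed
// to Hajime above), reframed as a defensive audit: read the Telnet banner,
// guess the vendor, and warn if that vendor ships factory-default credentials.
// Assumes Node.js; run it only against devices you administer.
import * as net from "node:net";

const FACTORY_DEFAULTS: Record<string, [string, string]> = {
  mikrotik: ["admin", ""],       // documented MikroTik factory default
  "acme-cam": ["root", "root"],  // placeholder entry, not a real vendor
};

function readBanner(host: string, timeoutMs = 3000): Promise<string> {
  return new Promise((resolve) => {
    const socket = net.createConnection({ host, port: 23, timeout: timeoutMs });
    let banner = "";
    const finish = () => { socket.destroy(); resolve(banner); };
    socket.on("data", (chunk) => { banner += chunk.toString(); });
    socket.once("timeout", finish); // nothing answered in time
    socket.once("error", finish);   // port closed or host unreachable
    setTimeout(finish, timeoutMs);  // give the device a moment to print its prompt
  });
}

async function audit(host: string): Promise<void> {
  const banner = (await readBanner(host)).toLowerCase();
  for (const [vendor, [user, pass]] of Object.entries(FACTORY_DEFAULTS)) {
    if (banner.includes(vendor)) {
      console.log(`${host}: banner looks like ${vendor}; factory default is ` +
        `"${user}" / "${pass || "(blank)"}" -- change it if it is still set.`);
      return;
    }
  }
  console.log(`${host}: Telnet closed, or no known vendor match in the banner.`);
}

audit("192.168.88.1"); // placeholder address for a router on your own network
```

The design point is the same one that benefits Hajime: a lookup keyed on the banner avoids a long string of failed logins.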

 

Another major differentiator between Hajime and its black hat botnet foes is that it is maintained in a slicker manner. It encrypts communications between nodes and uses a BitTorrent-based peer-to-peer network to distribute updates and commands. That encryption and decentralization give it a stronger defensive posture against Internet backbone companies or ISPs that want to root it out. When Rapidity Networks found a flaw in an earlier version of Hajime, the author updated it to correct the problem.

 

What Else Does Hajime Do?

 

Beyond tailoring the brute-force Telnet credentials it uses to the device it has identified, Hajime has several other capabilities:

 

  • It can infect ARRIS modems using a known remote backdoor, password-of-the-day.
  • While infecting a device, it determines the platform and can work around the absence of download commands (wget, etc.) by using its loader stub (.s).
  • The loader stub is produced dynamically from hex-encoded strings, using assembly programs custom-designed for the platform; once the stub is created, the loader’s IP address and port number are patched into the code.
  • Hajime can determine if an infecting node is currently accessible; if it isn’t, the malware will switch to another device to download the initial code.

 

Temporary Hardening of IoT Devices

 

Hajime does not permanently protect the devices it infiltrates. Just like Mirai, when the device is rebooted, Hajime is gone, and the ports are again vulnerable to Mirai infection. Since both types of infection are short-lived, experts think that Mirai and Hajime will be competing against one another for control indefinitely.

 

There has been vigilante, white-hat malware in the past. The most obvious example in this case is Wifatch, which invaded IoT devices, changed default passwords, shut off ports, and posted warning messages.

 

The issue with any type of malware, even one with good intentions, is that there can be collateral damage to the device. If the exploit is performed incorrectly, or a port that is actually in use gets blocked, the device’s owner loses functionality. The malware could also infect key infrastructure and push it offline. In other words, we should be careful about assuming Hajime won’t come with a downside.

 

*****

 

Leaving Web safety up to a duel between Mirai and Hajime doesn’t work when it comes to your business. Are you concerned about whether your company can defend itself against DDoS attacks? At Total Server Solutions, our mitigation & protection solutions help you stay ahead of attackers. See our DDoS Mitigation Solutions.

Mobile Mistakes for eCommerce Sites

Posted by & filed under List Posts.

Many of us in the business world still use a desktop or laptop computer as our primary tool for accessing the web. However, the growth of mobile computing over the last few years has been astounding. It is an easy argument that the real face of the internet now is not a PC but a smartphone or tablet:

 

  • According to internet usage tracker Statcounter, which analyzes access to 2.5 million sites, October 2016 marked the first month that mobile traffic exceeded desktop/laptop traffic, at 51.2% (46.5% smartphone & 4.7% tablet). In 2013, 1 in 4 users (25%) were accessing from mobile; in 2010, 1 in 20 people (5%) were.
  • The number of mobile web users globally (not to be confused with mobile phone users) was expected to exceed 2 billion in 2016 (IDC). Just nine years earlier, in 2007, desktop had 1.1 billion users vs. 400 million on mobile (comScore). In other words, the mobile web grew roughly 400 percent over that period.

 

Mobile is clearly a much more important part of business than it was in the past. Many consumers will buy on mobile. Others will research on their phone or tablet before switching to a PC to make the purchase. Either way, an e-commerce company needs a strong presence on mobile to beat out its competition.

 

Top Mistakes E-Commerce Companies Make on Mobile

 

Here are thoughts from entrepreneurs on what kinds of missteps e-commerce companies tend to make when aiming to make the most of the mobile web:

 

#1 – Challenging to check out

 

E-commerce companies have generally gotten the idea that you have to show people exactly how the product looks if you want them to buy. However, for many companies, the mobile experience – checkout included – is simply a scaled-down reflection of the desktop setup.

 

Be sure that your checkout is optimized specifically for mobile. Optimizing mobile involves “taking advantage of mobile-specific features (like using specific keyboards for different fields), dividing up forms into many more pages and getting rid of unnecessary fields,” notes Shop It To Me founder Charlie Graham.
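For the first of those features – field-specific keyboards – the fix is mostly a matter of HTML input attributes. Here is a minimal browser-side TypeScript sketch; the element IDs are assumptions standing in for whatever your checkout form actually uses.

```typescript
// Hint the right mobile keyboard for each checkout field by setting the
// HTML type / inputmode / autocomplete attributes. Element IDs are placeholders.
function setField(id: string, attrs: Record<string, string>): void {
  const el = document.querySelector<HTMLInputElement>(`#${id}`);
  if (!el) return; // field not present on this page
  for (const [name, value] of Object.entries(attrs)) {
    el.setAttribute(name, value);
  }
}

setField("email", { type: "email", autocomplete: "email" });                  // @-friendly keyboard
setField("phone", { type: "tel", autocomplete: "tel" });                      // dial pad
setField("zip", { inputmode: "numeric", autocomplete: "postal-code" });       // digits only
setField("card-number", { inputmode: "numeric", autocomplete: "cc-number" }); // digits only
```

In practice you would set these attributes directly in the form markup; the script form just makes the mapping easy to see in one place.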

 

#2 – Frustrating form overload

 

Smartphones and tablets are certainly convenient for internet access, but typing can be a pain. For that reason, Nicolas Gremion of Free-eBooks.net echoes Graham’s point about minimizing fields and forms; plus, he suggests integrating other services that might already contain user information. Allow them to register using their Facebook or Google account. Allow them to pay via Amazon Checkout, Fortumo, or PayPal. Have a checkbox that allows them to automatically transfer their billing info into the shipping section (i.e., without having to re-type it). Test the process carefully for any snags.
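The billing-to-shipping checkbox Gremion mentions is simple to wire up. Below is a minimal DOM sketch in TypeScript; the same-as-billing checkbox and the billing-*/shipping-* field IDs are assumptions about your markup, not part of any particular platform.

```typescript
// Copy billing fields into the shipping section when "same as billing" is
// checked, so mobile shoppers don't have to retype them. IDs are placeholders.
const COPIED_FIELDS = ["name", "street", "city", "zip"];

document.getElementById("same-as-billing")?.addEventListener("change", (event) => {
  const useBilling = (event.target as HTMLInputElement).checked;
  for (const field of COPIED_FIELDS) {
    const billing = document.getElementById(`billing-${field}`) as HTMLInputElement | null;
    const shipping = document.getElementById(`shipping-${field}`) as HTMLInputElement | null;
    if (!billing || !shipping) continue;
    shipping.value = useBilling ? billing.value : "";
    shipping.disabled = useBilling; // one less set of fields to tap through
  }
});
```

However you implement it, test the whole flow on a real phone – exactly the kind of snag-hunting Gremion recommends.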

 

One key aspect to keep in mind is that mobile users are not clicking with a mouse but manipulating the screen with their fingers – particularly the thumb. Crazy Egg’s analysis of this topic suggests there are three main ways people hold their smartphones: one-handed (49%), cradled (36%), and two-handed (15%). In all these scenarios, the thumb is critical. Because of that, there is a concept called the Thumb Zone – the area of the screen that is comfortably accessible to the thumb. Roughly speaking, the Thumb Zone is the bottom left-hand corner of the screen. Be aware of that when designing checkout.

 

#3 – Not easy to navigate between products

 

Studies show that more consumers will now purchase from a mobile device, but the process can easily become confusing if you have a broad catalog with numerous categories in your shop. Jonathan Long of Market Domination Media recommends checking out the Best Buy site on mobile to get a sense of a user-friendly mobile experience for a store with a huge range of products. Especially when people are ready to buy (and that describes your ideal traffic), they want to be able to navigate to what they want quickly. Make sure that they can.

 

#4 – Pestering pop-ups

 

You never want to bombard your e-commerce customers with too many pop-ups – and that’s especially critical on mobile. If the average desktop/laptop shopper already seems a bit obsessed with how quickly and intuitively they can get what they need on your site, any sense of patience is gone when that person picks up a mobile device. Hubstaff.com co-founder David Nevogt notes that he will typically abandon a mobile shopping cart if he gets more than 2 pop-ups. “The only exception to this rule is if I’m given the opportunity to sign in via my social accounts,” he clarifies, “because that’s a pop-up that helps me versus a pop-up that asks for my email, which serves the e-commerce company more.”

 

#5 – Really poor responsiveness

 

No one goes to your mobile e-commerce shop to wait. A consumer wants to be able to jump around and explore your products rapidly so they can compare options and buy. That requires your site to be strongly responsive. Similarly, user-friendliness is a necessity on mobile, as indicated previously. EVENTup cofounder Jayna Cooke advises developing your mobile shop carefully and methodically prior to release. Related to responsiveness, it’s critical that you host your site on high-performance infrastructure if you want it to perform at the pace of e-commerce.

 

#6 – Social sharing not set up

 

The two most prominent areas of growth on the web are arguably mobile and social. Consider these year-over-year changes in social and mobile social use:

 

  • Between January 2016 and January 2017, the number of active social media users grew 21%, representing an additional 482 million users globally.
  • During that same period, active mobile social use grew 30% — an addition of 581 million people.

 

How can you integrate social prior to checkout? Make it possible for the shopper to ask their friends if they’re undecided on a product, says Allied Business Network co-founder Brooke Bergman. It’s free publicity even if they don’t end up buying.

 

Related: 11 Primary Mistakes Ecommerce Companies Make on Social Media

 

#7 – Relegation of remarketing

 

Don’t be shy about asking for a name and email address early. Once you have that contact info, you can send an email with a coupon code so the shopper gets a discounted price if they return. As an alternative or supplement to that tactic, you can also use AdWords remarketing, explains Andesign’s Andrew Namminga, which “will prioritize the delivery of ads to people who have recently visited your website.”

 

#8 – Denial of mobile diversity

 

Your site should be compatible with every type of mobile device. Any phone or tablet should get impeccable ease-of-use, notes True Film Production CEO Stanley Meytin. Be sure to test on each one.

 

#9 – Absence of an 800 number

 

Of course you want everyone to just buy through the site, but your mobile site should also give users a fast way to speak with someone at your company directly: a phone number. On a desktop or laptop, people will often check your FAQ pages or go elsewhere on your site to get their answers. Mobile users want straightforward navigation. When they get confused, it makes sense (especially since many are already on their phone) that they would want to simply click to call and get help problem-solving. That phone number is especially important, says LSEO’s Kristopher Jones, because mobile users will often need “a higher level of touch” than their desktop counterparts.

 

*****

 

Do you want your e-commerce company to excel on mobile? At Total Server Solutions, all of our high-performance hosting plans include Unlimited Bandwidth. Learn more.