speeding up your WordPress site - how to optimize performance

Posted by & filed under List Posts.

You want your WordPress site to load as fast as possible, for a few key reasons:

  • Ecommerce shoppers can be incredibly impatient when it comes to page load time. In fact, the average attention span dropped from 12 seconds to 7 seconds between 2000 and 2016. Shoppers with that little patience will not wait around for a slow page, so focusing on performance will yield more sales.
  • The extent to which people demand a fast load time is backed up by studies. Research has shown that 47% of people who visit a website will bounce away if the site fails to load in just 2 seconds.
  • People prefer faster sites since they can get what they need sooner, and search engines reflect that same preference: Google and others use speed as a ranking factor.

Any visitor gives you an incredibly small window of time to show what you have and get them interested in buying. If your site is not performing at a fast clip, you will spend that tiny window failing to get them what they want. Research from Strangeloop found that slowing page load time by just 1 second produces a 16% drop in customer satisfaction, 11% fewer page views, and a 7% reduction in conversions.

Check your site speed.

To get a sense of how your site performs in different areas, you can use an online tool such as Pingdom to test from different geographical regions.

Streamline and optimize plugins.

You do not want any plugins active on your site that are unnecessary or not pulling their weight. Minimize the number you run; you can often identify the worst offenders by deactivating plugins one at a time and re-checking your site's speed.

To optimize your plugins, you can check for any inefficiencies in the code. You specifically want to look for any calls to the database that are not needed. “WordPress has its own caching system, so generally speaking, using functions like get_option(), update_option() and so on will be faster than writing SQL,” noted the WordPress Codex.
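
As a concrete illustration of the Codex's advice, here is a hedged sketch comparing the two approaches ('blogname' is a real built-in option and $wpdb is WordPress's database object, but the snippet is for illustration rather than production use):

```php
<?php
// Illustrative comparison: two ways to read the site title from wp_options.
global $wpdb;

// Hand-written SQL: queries MySQL directly and bypasses WordPress's cache.
$title = $wpdb->get_var(
    "SELECT option_value FROM {$wpdb->options} WHERE option_name = 'blogname'"
);

// get_option(): returns the same value, served from the object cache when possible.
$title = get_option( 'blogname' );
```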

Use a WordPress caching plugin. 

WordPress builds each page dynamically: every time someone visits your site, the server gathers information from your PHP files and MySQL database and compiles it into the HTML content presented to the visitor.

That retrieval and building can lead to poor performance if there are many people using your site at the same time. Using a caching plugin, just by itself, can make your site 2 to 5 times as fast, per WPBeginner.

Instead of building the site from scratch each time a new user visits, the plugin copies the page when it loads for the first person, sending the cached copy to anyone else who visits. 
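
A minimal sketch of that flow, assuming a file-based cache (this shows the general technique, not the code of any particular plugin; the cache directory and one-hour lifetime are arbitrary):

```php
<?php
// Serve a saved HTML copy when a fresh one exists; otherwise build the page
// dynamically and store a copy for the next visitor.
$cache_file = __DIR__ . '/cache/' . md5( $_SERVER['REQUEST_URI'] ) . '.html';

if ( file_exists( $cache_file ) && ( time() - filemtime( $cache_file ) ) < 3600 ) {
    readfile( $cache_file ); // cache hit: no PHP templating or MySQL queries
    exit;
}

ob_start();
// ... normal dynamic page generation (PHP + MySQL) would run here ...
file_put_contents( $cache_file, ob_get_contents() ); // save for later visitors
ob_end_flush(); // send the freshly built page to this visitor
```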

Decrease the size of CSS and JS files.

You can speed up how fast your site loads pages by minimizing the size of JS and CSS files, as well as by reducing the number of server requests they generate. Minification of JS and CSS files is one of the recommendations within Google PageSpeed Insights.

You can do this manually by working through your theme's files, or use a plugin. The one suggested by CodeinWP is Autoptimize.
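
To show what minification actually does, here is a deliberately crude sketch (real tools such as Autoptimize handle many edge cases this one ignores):

```php
<?php
// Naive CSS minifier: strip comments, collapse whitespace, and drop spaces
// around common punctuation. For illustration only.
function naive_minify_css( $css ) {
    $css = preg_replace( '!/\*.*?\*/!s', '', $css ); // remove /* ... */ comments
    $css = preg_replace( '/\s+/', ' ', $css );       // collapse runs of whitespace
    $css = str_replace(
        array( ' {', '{ ', ' }', '; ', ': ' ),
        array( '{', '{', '}', ';', ':' ),
        $css
    );
    return trim( $css );
}

echo naive_minify_css( "body {\n    color: #333; /* default text */\n}" );
// Output: body{color:#333;}
```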

Shrink images. 

You want to get image sizes down as much as possible without a visible loss of quality. Photoshop, the Chrome PageSpeed Insights extension, and other general-purpose software can make reducing image size a slow process. Plugins that specialize in this task are preferable: EWWW Image Optimizer, WP Smush, and Optimole are all prominent, well-rated options. Any of them will significantly reduce the size of your images and, in turn, accelerate your performance.

Keep your site updated. 

WordPress is updated often. Those updates include security patches, new features, and bug fixes. The plugins and themes you use should have new versions released on a routine basis too.

If your site does not have the newest versions of the core code, theme, and plugins, you will be more at risk for security issues and will also suffer from poor speed. For that reason, make sure the most recent versions of all your WordPress site elements are installed.

Enable OPcode caching.

WordPress involves a massive amount of PHP, and every time a page loads, a large chunk of that code has to be parsed and compiled.

Particularly in shared hosting settings, this parsing process can drag down your site's speed. OPcode caching can help: your compiled PHP is set aside in a temporary cache, so it does not need to be reparsed nearly as often.

OPcode caching was provided in the past by third parties, the most popular of which was Zend. When PHP 5.5 was released, Zend open-sourced its opcode cache and contributed it to the PHP project, so it is now a standard part of PHP.

OPcache is available in PHP 5.5 and later. Keep your PHP version updated; newer releases add further options for this feature.
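
If you want to confirm OPcache is actually active on your server, PHP exposes a status function when the extension is loaded. A quick hedged check (the printed fields are standard OPcache statistics):

```php
<?php
// Quick OPcache health check (PHP 5.5+ ships the extension by default).
if ( function_exists( 'opcache_get_status' ) ) {
    $status = opcache_get_status( false ); // false = omit per-script details
    $stats  = $status['opcache_statistics'];
    printf( "OPcache enabled: %s\n", $status['opcache_enabled'] ? 'yes' : 'no' );
    printf( "Cached scripts:  %d\n", $stats['num_cached_scripts'] );
    printf( "Hit rate:        %.1f%%\n", $stats['opcache_hit_rate'] );
} else {
    echo "OPcache is not loaded on this server.\n";
}
```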

Optimize themes.

A poorly built theme can put as much as 3 times the load on the server due to excess, unoptimized database queries – so queries are a key point. However, total file size and count, along with image files, must also be addressed.

  • Practice query optimization and minimization. You may want to hardcode static values into your theme, which reduces queries; the downside is that you must edit the code whenever a value changes. Your site title and charset are examples of values you could hardcode. You can do the same with menus so that your site does not run wp_list_pages() or similar functions (see the sketch after this list).
  • Get the size and number of your files down. Minify your JS and CSS files, combine multiple CSS files into a single optimized file, and keep the number of files needed to display an average page as small as possible. Plugins can aid these efforts.
  • Reduce image files. Check for any images you do not need that could perhaps be replaced by text, and make sure the remaining images are in the best format for the image type and are optimized. Plugins such as WP Smush can also help here.
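
As promised above, here is a hedged sketch of the hardcoding trade-off in a theme header ("My Example Store" is a placeholder; bloginfo() is the standard WordPress template tag being replaced):

```php
<!-- Dynamic version: each template tag triggers a lookup on every page load. -->
<title><?php bloginfo( 'name' ); ?></title>
<meta charset="<?php bloginfo( 'charset' ); ?>">

<!-- Hardcoded version: zero lookups, but must be edited by hand if it changes. -->
<title>My Example Store</title>
<meta charset="UTF-8">
```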

Improve hosting.

As noted by CodeinWP, your first consideration when trying to speed up WordPress performance is the infrastructure, which means improving your hosting. Shared hosting is the least expensive option, so it is understandable that smaller businesses turn to it upfront. Shared hosting will not provide strong performance during peak hours, though; there are too many other sites tapping into the same pool of resources you are. Within a cloud environment, you can set up virtual servers that allow you to maintain strong performance, with better demarcation between the different accounts.

Do you want to optimize the speed of your WordPress site? It is critical to your results. At Total Server Solutions, our cloud hosting boasts the highest levels of performance in the industry. Build your WordPress cloud now.

web development trends for 2019

Posted by & filed under List Posts.

Three-part series (linked as they go live):

Online Growth Trends 2019: Ecommerce

Online Growth Trends 2019: Cloud

Online Growth Trends 2019: Web Development


Forecasters are often criticized because they simply cannot always be accurate – there is ultimately some guesswork involved. Meteorologists are a great example of a group maligned for inaccuracy when they are in fact remarkably accurate on the whole: SciJinks, from the National Oceanic and Atmospheric Administration (NOAA), notes that 5-day forecasts are about 90% accurate, while 7-day forecasts are approximately 80% accurate. Similarly, predicting trends for 2019 can be done with reasonable accuracy by sourcing reliably and remembering that no technology's growth happens in a vacuum; each trend is in relationship with the others. Hence, most of these 2019 trends should hold true despite the fact that we are making prognostications at the beginning of the year.

Increasing need for adaptability 

Since mobile has become such a critical part of the Internet (representing 63% of users according to a 2017 analysis), meeting the needs of mobile users is a first priority. Adaptive websites and applications are becoming more common in this climate: services are easily accessible and immediately web-connected in these highly scalable environments, and apps are often built to operate offline, with the ability to connect and transfer data as needed. Visual web development – i.e., web design – is going through seismic changes too. Adaptability is by no means built into all systems at this point, but it continues to become a more crucial guiding principle in 2019.

New coding in JavaScript and PHP7  

JavaScript remains a staple of developers' day-to-day work, noted Tarun Nagar. While the language has many aspects that are imperfect, it is developing all the time and continues to be used by most organizations worldwide. Turning to PHP, to say that it is popular is a huge undersell; it is actually used on 4 out of every 5 websites (80%), per Linux systems administration journalist Hayden James. The release of PHP 7 has also caused sea changes within development, adding features such as group use declarations, engine exceptions, and anonymous classes, as Nagar noted. Another significant aspect of PHP 7 is that it introduced the Unicode codepoint escape syntax.

Website performance has benefited from the speed improvements PHP 7 allows. That performance has helped enable the huge trends in blockchain systems, P2P trading and exchanges, and the cryptocurrency industry. While organizations have started to realize the many powerful applications of blockchain, cryptocurrency was the first; to create cryptocurrency, web developers have to understand the technology completely. Organizations that develop cryptocurrency should also be careful with their user agreements, given the risks of the field.

Web design shifts online

The choice of programming language used to be up to individual coders; with HTML5, JavaScript broke out of its web-only niche and became a nearly ubiquitous development language.

There are many ways you can customize your approach to JavaScript by utilizing different JavaScript frameworks. While having various frameworks is not completely aligned with standardization, it is possible to transfer the basic concepts used within one framework to another setting, as indicated by Carl Bergenhem. “This shifts the focus to better programming habits and architecture of web applications, rather than being akin to picking your favourite flavour of ice cream,” said Bergenhem.

Because native mobile and web applications share a single codebase within React Native, NativeScript, and similar frameworks, those frameworks will help draw additional coders to web technologies.

Another reason coders will increasingly turn to the web is WebAssembly. Languages such as Rust, C#, and C++ are able to tap into the web because of this technology. More languages will follow thanks to projects such as Blazor, which brings .NET to the web. The language a developer uses will cease to matter for the web – which means, essentially, that all developers are now web developers.

Progressive web applications (PWAs) will further blur the line between web apps and native mobile apps. As the choice between the two becomes less distinct, coders will be freed from worrying about the platform decision and can place their priorities squarely on user experience.

Personalization via AI

The applications of emergent technologies are often discussed in terms of their sexiest representatives – hence we discuss automation in terms of the driverless car. However, the power of a less glamorous technology such as artificial intelligence (AI) cannot be ignored, and there are plenty of exciting applications for it too, across the diverse environments in which it will be deployed. Analytics has traditionally been about logging data and using it for the next version – a reactive approach. In 2019, analytics will reach another tier of development with the implementation of machine learning: data on how your app is being used will be gathered and used to determine how the site should evolve, improving the quality of the user experience immediately. This approach is proactive.

What this agility permits is the capability to deliver the UX most suitable to the particular person – assuming sufficient data is available for that user. This chameleonic aspect will allow you to produce a personalized website that can recognize the user and then present different tools and functionalities.

Progressive web apps 

Discussed briefly above, PWAs are becoming more prominent based on a sharpening understanding of user behavior. These apps are built in a manner intended to improve retention and sales by making them easier for individual people to use. HTML, JavaScript, and CSS are some of the core technologies that make progressive web apps possible.

Again, the idea of adaptability being key to web development is huge in 2019. PWAs update independently, allowing for autonomy, and they deliver full functionality regardless of native user settings or mobile device. The Service Worker API is typically used for automatic updating of PWAs, and the HTTPS protocol is used to protect data via encryption.

Web accessibility

Web accessibility is an issue technologists probably discuss too little. The truth is that this need – practicing inclusivity by ensuring those with disabilities do not have difficulty using your site – is becoming more central in 2019. Bergenhem noted that accessibility will continue to grow in importance to development, whether because of governmental regulations or because developers turn to methods that are inherently more accessible. “Accessibility is essential for developers and organizations that want to create high quality websites and web tools, and not exclude people from using their products and services,” explained the World Wide Web Consortium (W3C).

High-performance infrastructure to accelerate development

Staying abreast of trends in web development can help you to sharpen your skills in the most important areas as time passes. Of course, web development is not just about incorporating approaches into the way you develop but building and operating through fast, reliable infrastructure. At Total Server Solutions, we have brought together some of the best, most high-performance technologies and packaged them to be used together. See our true hosting platform.

cloud computing trends

Posted by & filed under List Posts.

Three-part series (linked as they go live):

Online Growth Trends 2019: Ecommerce

Online Growth Trends 2019: Cloud

Online Growth Trends 2019: Web Development


A September 2018 Gartner report projected that cloud computing would expand to $206.2 billion in 2019, up 17.3 percent. The bad news is that this is a slight downturn in projected growth rate, with Gartner having forecast 21% growth for 2018. Expansion is still rapid, though. Given that incredible general growth rate, cloud is a trend in and of itself. It is becoming so ubiquitous that it is increasingly worthwhile to consider how the field is changing and what that might mean in terms of opportunities for businesses and organizations.

Top trends in cloud computing for 2019 include the following:

Serverless computing

One IT method becoming more prevalent is to pay a cloud host for access to a platform running in its public cloud – a tactic called serverless computing. This service, available through some hosting providers, allows you to use platform as a service (PaaS) via a container through the cloud host, which charges for platform access while handling the setup of the physical machines and the configuration of the servers.

Serverless computing is attractive to organizations for the same reasons cloud itself is – the ability to pay on demand for services rather than making capital investments in costly machines and environments. Servers must otherwise be purchased, housed, and configured; all of that can be avoided with serverless computing.

Service meshes

For multiclouds, service meshes such as Linkerd, Envoy, and Istio will serve as the network management backplane. Service meshes will allow companies to integrate their private and public cloud environments with on-premise containerized data. Cloud service providers will increasingly use hub-and-spoke and mesh systems to allow easy integration and management of thousands of on-premise networks and virtual private clouds.

AI platforms

Artificially intelligent (AI) platforms are built to operate more intelligently, and hence more optimally, than traditional systems. AI functionality is used within big data systems to develop stronger knowledge of how a business functions by enhancing its ability to collect solid business data.

You can get work completed faster with an AI platform in place, since it helps ensure work is distributed evenly. When data governance standards are integrated into the platform, machine learning and AI engineers can be better guided to follow best practices.

An AI environment can also cut your expenses by helping you to automate some labor-expensive and/or simple tasks (e.g., data extraction and copying), as well as to avoid error duplication. Staff members and data scientists can work together to improve your efficiency and speed if your AI platform is well-designed.


Multicloud and hybrid cloud tools

Additional multicloud and hybrid cloud tools will become commercially available. In order to mitigate risk, control costs, and perform migrations quickly, organizations will increasingly want multicloud backplanes, migration tools, and professional services from their cloud providers – accelerating their development. As these functionalities become more widely available, transitioning to cloud-native backbones via lift-and-shift, whether for data, workloads, or applications, will grow, noted James Kobielus. More companies will be putting legacy workloads into containers, avoiding the need to rewrite the code. That allows sophisticated migrations to occur without assuming as much technical risk. Migration to IaaS and PaaS platforms from legacy, on-premise infrastructures will occur as it becomes increasingly affordable to do so.


Refactoring

Cloud-native development does not address the on-premise apps already built, and lift-and-shift is not the only option for those existing systems. Refactoring will become a more broadly used practice too. Prior to designing their infrastructure for multicloud, organizations will often think about how to move workloads and refactor. In order to benefit from native cloud services, organizations will reprogram or refactor rather than relying as much on lift-and-shift in 2019, according to the analysis of Cloud Technology Partners technology evangelist Ed Featherston.

Shortage of cloud skills 

The cloud carries with it the need for highly specialized skills that are costly, very much in demand, and not easy to find. Since that's the case, the transition to cloud could worsen the staffing shortages that have troubled IT for some time.

According to a report featured in ITProToday, the cloud skills gap is so critical and so substantial, it costs the average large enterprise a quarter of a billion dollars ($258 million) annually. That amounts to 5% of their annual global revenue, on average.

Orphaned resources 

Cloud is easy to adopt, but it can lead to waste. A recent report by RightScale found that 30% of cloud investment is wasted by the average cloud-using organization. People might spin up a cloud service and keep it running even if they do not use it. Cost optimization within cloud will continue to become a key point of focus, so that the wasted spending of orphaned resources can be avoided. 

Cloud data lakes, databases, and warehouses

The greatest challenge for business intelligence and data warehousing has been answered by cloud data stores. Self-service platforms have typically been unreliable, noted an article in AI Business, while clunky schema configurations and slow relational methods in traditional architectures have hampered business access. The Internet of Things, artificial intelligence, and other technologies can benefit from the scalability of cloud data stores – as well as from their direct access to analytics tools.

Internet of Everything

Often we talk about the Internet of Things (IoT) in terms of the new world in which virtually everything around us becomes an endpoint of the Web. However, that discussion is often referring to a broader concept, the Internet of Everything (IoE), which goes beyond connected things to also include data, process, and people – as indicated by Angela Karl. “IoE works to provide an end-to-end ecosystem of connectivity that consists of ‘technologies, processes, and concepts employed across all connectivity use-cases,’” wrote Karl, quoting Cisco.

The IoE utilizes data, processes, and machine-to-machine communication to learn about how people interact with their environments. A good example of its use is hospitality robots in Japan. These intelligent robots can blink, breathe, make hand gestures, and otherwise behave as humans do. They speak Japanese as well as fluent Chinese, Korean, and English. They greet guests, interact in real time, and provide simple services.

Hybrid cloud

Hybrid clouds combine the two models of cloud, public and private. Dataversity forecast that the benefits of hybrid cloud would eventually make it the chief model for cloud.

The obvious upside of hybrid cloud is that it increases your flexibility; the downside is that it increases complexity. Per the NIST definition, a hybrid cloud is a composition of distinct cloud infrastructures. That will often mean blending public cloud and private cloud, but it can also mean combining public cloud, community cloud, and/or private cloud.

Your cloud partner for 2019

Are you creating a cloud environment so that your organization can benefit from this technology as effectively as possible? Cloud is not just about what you do yourself but about having the right partners you can trust to deliver secure and reliable services. Like the Internet of Everything, that means not forgetting people. At Total Server Solutions, we maintain an around-the-clock staff of experts. Our people make all the difference.

how ecommerce is changing

Posted by & filed under List Posts.

Every year at the New Year, individuals use resolutions to improve themselves – which often means getting into shape or kicking an unhealthy habit. In this way, individuals optimize themselves. Since people are thinking along these lines about themselves, it follows that business leaders are thinking the same way about their organizations. Hence the great amount of discussion, at this time of year, of developments and trends in industries and tools that might be of particular use to organizations.

Three key categories of trends related to online success (with some overlap) are the focus of this three-part series (linked as the other two parts are published):

Online Growth Trends 2019: Ecommerce

Online Growth Trends 2019: Cloud

Online Growth Trends 2019: Web Development

We start now with ecommerce trends and strategies to help you stay abreast of and ahead of your competition.

Continued growth

Growth of the industry as a whole can itself be considered a trend that will stay with us in 2019. By 2021, worldwide revenue from online sales is projected to hit $4.88 trillion, per one analysis. Since ecommerce is growing, anyone in that field should assume there will be more and more competition as time passes.

Social selling

According to Global Web Index, social media now takes up more than 30 percent of the average user's daily online time – 2 hours and 15 minutes per day in 2017, up from 1 hour and 30 minutes in 2012. That statistic is critical to ecommerce because it tells you where you can go to engage with customers and prospects. After all, you can spend as much time as you want crafting a message, but eventually you must figure out how to reach your target audience so that they see it.

People are buying on social. Snapchat and Instagram allow you to buy from within stories; plus, there are now purchase buttons that can be added to social media profiles (leading to offers within the platform or on your site). In these ways, social media has created new avenues for ecommerce.

Video marketing

With the rise of social selling, there will also be a continuing rise in marketing through video. Along with 360-degree videos and personalized video, organizations will also further embrace live streaming. 

Concerns with trust and privacy

A fascinating finding from the Global Web Index's Trends 19 report is that people are “more worried about how companies are using their personal information than the impact the internet has on personal privacy and security in general.” In other words, the specific steps your company takes toward these ends are of special concern to shoppers.

When you look globally, it helps to be aware of the audiences that give privacy particularly high priority. Users in Latin America show the greatest concern about how individual companies use their data, and users in that region and in APAC are the top two groups for general web privacy as well.

While those perspectives are key to improving worldwide sales, the same concerns among users in the United States and United Kingdom weigh even more heavily on many ecommerce firms. 2018 was a year of sea change in terms of consumer awareness in those nations: 73% of US users and 65% of UK users reported greater awareness of organizations' handling of personal data last year. The General Data Protection Regulation (GDPR) is assumed to have been important in raising people's consciousness of this key Internet issue.

Mobile search 

Most online traffic in the United States is now mobile. Mobile optimization is therefore a key concern, with mobile search in the news since the July 2018 Google Speed Update, which made speed a factor in searches from cellphones and tablets. It is thus critical to implement high-performance infrastructure for mcommerce to succeed, at least within that search engine. The new search factor stemmed from Google's finding of a massive disparity between mobile loading times and consumer expectations: the average mobile landing page took 22 seconds to load, far longer than the 3 seconds within which a typical user says they expect pages to load.

Plus, ecommerce professionals will optimize for mobile search as they adapt to changing search habits, with people shifting toward asking Alexa or Siri a question instead of typing text into a browser.


Personalization

When a vegetarian buys pots and pans online, they are likelier to respond well to images of people cooking vegetables than meat – the decision is tied to the individual's philosophy. Relevance is fundamental to the value a user receives from your site, and you cannot make every person's interaction with you relevant through a cookie-cutter approach. Customizing the user experience with personalized content is becoming more important every day. You can tailor content to the different perspectives of your visitors through adaptive landing pages and dynamic website personalization (DWP).

New realities: virtual and augmented

Undoubtedly you have seen mixed reality technologies on trend lists in the past – their use has been increasing over the last few years, per SchoolSafe cofounder Tiffany Delmore in Entrepreneur. There is evidence of this focus in the Place app from IKEA, which lets you see what products would look like in your home via augmented reality. A broader tool, Axis, is an ecommerce platform with VR and AR capabilities – and it can be integrated with Magento and other environments.

While many ecommerce organizations are not yet using AR, companies that do launch AR/VR environments can benefit from the power of these technologies by creating a more complex, interesting, and personally valuable shopping experience.


Dropshipping

Ecommerce can simply be a way to connect a supplier with customers, with you as the link – the strategy of dropshipping. In that scenario, a person orders a product from you for $100. You send a wholesale payment of $60, along with the order, to the product supplier. You keep $40, and the supplier handles shipping of the product.

Many businesses have already discovered the benefits of this method – about 1 in 6 firms (16.4%) were using it out of the 450 surveyed for the 2018 State of the Merchant eCommerce Report. The report found that companies that had implemented dropshipping saw a conversion rate of 1.74% and revenue growth of nearly a third (32.7%).

As noted by CrazyEgg, there are various reasons dropshipping is attractive to ecommerce owners:

  • Overhead is reduced
  • There are no costs for inventory storage
  • Products are only purchased when orders come in from customers
  • You take on less financial risk
  • The dropshipping process is easy to launch

This method is especially compelling for larger products, since it lets you avoid the complexities of shipping and storing those items.

Your high-performance ecommerce company

To get back to the notion of performance (discussed in the mobile section), improving your site speed is key, since it is a general ranking factor for all search. “While page speed is important for your SEO,” noted Edwin Toonen, “it is even more important for your UX, conversion, and general customer happiness.”

Beyond developing trends in marketing and other ecommerce methods, you need fast and reliable infrastructure for your ecommerce firm to excel. At Total Server Solutions, we provide high-performance infrastructure and managed services – offering the performance, power, and versatility to meet any ecommerce challenge. Explore our platform.

B2B 5-star customer service -- tips

Posted by & filed under List Posts.

High-quality customer service and support is critically important in web hosting and other business-to-business sales and relationships. How important is customer experience, particularly in B2B contexts? What steps can you take to improve your service and build a more satisfied base?

The commoditization of business-to-business services has created an environment in which it is more critical than ever to consider the needs of individual business customers. In order to break free of the commodity trap and get stronger loyalty and retention, it is helpful to understand the scope of both rational and emotional reasons that go into buying decisions. For example, someone might make a purchasing decision based on aspects that are often more associated with consumer purchasing, such as the desire to curb anxiety or bolster credibility.

A 2018 study featured in Harvard Business Review looked at dozens of different “elements of value” that are used by B2B buyers to compare options. The elements are organized as a pyramid. Example elements are divided into 10 categories within 5 value types: purpose elements within inspirational value; career and personal elements within individual value; productivity, access, relationship, operational, and strategic elements within ease of doing business value; and economic and performance elements within functional value; all based upon core table stakes (e.g. regulatory compliance).

The focus of this article is the quality of customer service and support, which falls under the relationship category within ease of doing business. Factors of relationship in the model include cultural fit, stability, commitment, expertise, and responsiveness; the last three of these are demonstrated and carried out by high-quality customer service.

Since customer service quality is so key to determining whether purchases are made and relationships are maintained, it would make sense for B2B firms to invest heavily in this concern – and they do. However, a McKinsey analysis reveals that the B2B approach lags B2C in this respect: while B2C companies get typical customer-experience ratings of 65 to 85%, B2B companies average below 50%. In essence, the B2B experience gets a failing grade!

Why service issues can be more problematic in B2B

Part of the reason that the quality of customer service may not be as central as it should be within B2B generally is that this aspect of the purchasing process is typically conveyed in mainstream discussion via B2C examples from hospitality and retail. Those examples are easier to make and use because they are immediately relatable, noted customer service author Micah Solomon. Solomon pointed out how significant a mistake that is by explaining how customer service issues are amplified within B2B specifically, in these three ways:

  • The value of a relationship is usually higher than with consumers;
  • Each individual sale is typically larger than B2C; and
  • A “multiplier effect” is at play in B2B relationships due to their complexity. Each piece of the puzzle influences the overall experience. For instance, the level of service quality provided by a subcontractor can either improve or pollute wholesale supplier relationships.

6 ways to bolster customer service & experience

Key ways that you can optimize your customer service and deliver an improved experience for B2B customers are as follows:

1.) Deliver seamless simplicity.

You want the relationship to be easy, of course, so thinking in terms of simplicity of the customer experience is pivotal. When customers are polled on their satisfaction, simplicity is a typical aspect evaluated, sometimes via a “customer effort score” or similar metric – which will reveal when there are issues with the system's simplicity. One key concern for customers is time: they want providers not to waste it, but to rapidly help them resolve their issue and move on with their day. “[R]educing customer effort is pivotal to delivering a more seamless and therefore more superior customer experience,” noted Julia Cupman.

2.) Understand what your customers want.

You of course want to give businesses what they want in order to feel comfortable buying. For example, nearly 3 in 5 people (59%) said in one industry poll that they would rather make purchases themselves, through online information-gathering, than talk with a salesperson.

People who are buying for a business from another business are highly focused on optimizing revenue and efficiency. Anxiety is common with these purchases, since opinion will ultimately influence what is chosen. A strong customer experience helps alleviate these emotional responses.

3.) Be proactive in solving customer problems.

You do not want the B2B customer to feel any pain, and the best way to address potential pain points is preventively. If you focus on finding pain points and forecasting possible customer needs, you will win long-term clients.

Being proactive is not just about letting the customer know about other services and products you offer. It is also about changing the way you communicate, as found in a study released by Osram Sylvania, a lighting firm. The company’s research determined that negative words such as don’t, won’t, and can’t typically led customers to feel dissatisfied. Simply by changing language, the B2B company helped bolster customer mood and experience.

4.) Personalize the buying process.

We all like to make choices about what we get when we shop for ourselves, whether we’re buying technology, food, or anything else – and the same is true of people buying for businesses. You can improve the value of what you offer through a personalized experience, with customized solutions. Otherwise, as indicated above, the commodity trap may keep your organization from differentiating itself. 

5.) Adapt as the environment changes.

A company that wants to deliver an extraordinary customer experience will adapt as the environment changes. An organization trying to offer customer service that continues to impress must make improvements over time – especially as newer technologies become available that can better meet business needs. Cupman noted that design thinking can be used to optimize the B2B customer experience, allowing you to tweak what you offer through reconsidered engineering. Taking this step can have a hugely positive impact on your bottom line. ING is a good example of success in this area, said Cupman, increasing its share price by 15% and its profits by 23% when it applied omnichannel automatic integration to its customer data during a broad system upgrade. Through these improvements, the bank formed an environment in which both corporate and retail customers can generate tailored reports and access real-time account overview data.

6.) Be a consultant. 

Business-to-business buyers often cite an overly aggressive sales approach as a troubling aspect of current or potential vendors; B2B firms are widely seen as too pushy. To free yourself from that stereotype, try going out of your way to inform and advise your clients. Doing so pays dividends: business buyers are five times (400%) likelier than consumers to treat providers preferentially when those providers give them new knowledge and insights.

The simple path to figuring out what your customers need is to listen. Ask your customers what they want and need. Then focus your educational, consultative approach on those areas.

Excellent people for excellent customer service

Are you looking for excellent customer service in your B2B relationships? The expert team at Total Server Solutions is made up of individuals with the highest levels of integrity and professionalism. Our people make all the difference.

Posted by & filed under List Posts.

Due to the broader use of 4K (aka ultra-high-definition) video on social platforms and business websites, along with the growing amount of data being consumed across industries, the content delivery network (CDN) market is expanding at a remarkable clip. MarketsandMarkets forecast a 32.8% compound annual growth rate (CAGR) for the global CDN market from 2018 through 2023.

Content delivery networks are used both to deliver content to consumers and to move it between businesses. These systems improve the performance of games, video, voice, ecommerce orders, mobile content, dynamic content, and static content.

Since CDNs are typically discussed in terms of boosting performance, it is worth considering the value of performance, as indicated by a couple of case studies. The BBC determined that it would lose 10% of site visitors for every additional second of loading time. Similarly, COOK found that improving its average load time by 850 milliseconds produced a 10% rise in pages per session, a 7% reduction in bounce rate, and a 7% boost in conversion.

You can see why this technology is compelling. Its benefits are even broader than those described above.

Content delivery network benefits

Why exactly are these systems used? Here are the basic reasons the CDN is adopted:

#1 – User experience

You are able to give end users a better experience through improved robustness (the capacity to draw on more than one delivery server) and lower latency (stronger throughput between the delivery server and the user, along with lower round-trip time).

#2 – Cost savings

It can reduce the amount of money you spend each month on web hosting by saving bandwidth and broadly distributing the work. You can save on power, space, data center capacity, and other IT infrastructure costs as files are increasingly handled by the CDN. Additionally, you are able to minimize your bandwidth usage and, in turn, minimize the expense of delivering cacheable files.

#3 – Performance

The distribution and intelligence of a CDN tend to improve performance. You achieve strong performance for users anywhere on the globe because you have your origin server along with replicated server clusters holding JavaScript files, CSS stylesheets, videos, images, and other static content. The replicated web servers within the CDN typically answer user requests, rather than requests having to travel all the way to the origin machine. Launching a CDN can give you a major boost, depending on where your typical users are located and the amount of content you need to load.

The raw speed improvement of a CDN is by itself incredibly compelling. An experiment that tested the implementation of a caching plugin (W3 Total Cache) and a CDN found that average initial load time improved by 1.5 seconds with the addition of each technology. In other words, the case study, published in WinningWP, found that site load times could be improved by 3 seconds with these two standard best-practice caching measures combined.

#4 – Better conversion rate

The people who visit your site will be happier if it is fast – and happier visitors become happier customers, yielding more sales.

#5 – SEO-friendly

Site speed is a ranking factor at the core of search engine optimization; you will get better rankings as you deliver your site faster.

This is not new information: Matt Cutts of Google announced back in 2010 that speed was a site ranking factor. Most recently, speed was announced as a ranking factor for the same search engine’s mobile searches in July 2018.

#6 – Web caching

A CDN excels as a way to manage a cache of small, static files such as JavaScript, CSS files, static images, and animated GIFs. It can also be good for caching audio recordings, video files, and other items that are particularly large and costly to deliver. A key aspect of management is determining when files expire to make room for new ones.
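
Expiry is usually communicated through standard HTTP headers, which a CDN honors when deciding how long to keep a copy before refetching it from the origin. A minimal sketch (the one-day lifetime is an arbitrary example):

```php
<?php
// Tell downstream caches, including a CDN, to keep this response for a day.
$one_day = 86400; // seconds
header( 'Cache-Control: public, max-age=' . $one_day );
header( 'Expires: ' . gmdate( 'D, d M Y H:i:s', time() + $one_day ) . ' GMT' );
```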

#7 – Request routing

One key benefit of using a CDN is that it involves file storage hardware and servers at numerous places worldwide. When users interact with the system, the request routing capability leverages GeoDNS to send each request to the content repository in closest physical proximity to the end user.

#8 – Geoblocking

You are able to restrict where your content is visible based on the location from which a user is accessing it, making any content unavailable in particular regions as desired.

#9 – Distribution

A content delivery network is essentially user-friendly because it considers where the user is on the planet. If you are not using a CDN and have all users requesting from you handled by a server in Dallas, people in Asia and Europe will have to make transcontinental hops to get to what your site has to offer. A CDN allows you to get downloads to people much more quickly because the technology leverages local data centers.

#10 – More than one domain

A browser puts a limit on how many simultaneous downloads can occur through one domain. Typically you can only have four connections going at once. Additional files have to wait until one of the four is complete. Since a CDN is located at a separate domain, the browser is able to double the connections (typically resulting in eight). 
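
In WordPress, pointing static assets at a separate CDN hostname is often just a URL rewrite. A hedged sketch ("cdn.example.com" and the function name are placeholders; the filter hooks are standard WordPress hooks):

```php
<?php
// Serve styles, scripts, and uploads from a CDN domain so the browser can
// open additional parallel connections beyond the per-domain limit.
function example_cdn_rewrite( $url ) {
    return str_replace( home_url(), 'https://cdn.example.com', $url );
}
add_filter( 'style_loader_src', 'example_cdn_rewrite' );
add_filter( 'script_loader_src', 'example_cdn_rewrite' );
add_filter( 'wp_get_attachment_url', 'example_cdn_rewrite' );
```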

#11 – Security

A content delivery network will also improve your ability to defend against cyberattack. It is, essentially, a layer between you and the Internet. Traffic is filtered and greatly improved before it hits your site, with the CDN pulling out the spammers, bots, hackers, and bogus requests of distributed denial of service (DDoS) botnets. In this way, no one can touch the origin server, the core of your system. Proxy machines might go offline, but users could still get to your site, because an outage would only affect service through that single machine.

Content delivery networks have some built-in defenses against the bogus traffic spikes of DDoS. It helps to remember that these systems were built to analyze and properly handle strong fluctuations in traffic. A CDN sends fraudulent requests to a scrubbing node, called a blackhole, so that the site does not experience harm. CDNs have sometimes been able to mitigate small DDoS attacks simply by spreading the requests across the larger network; however, that tactic does not work for large DDoS events. When a CDN is bolstered with dedicated DDoS protection, it is optimized to prevent your site from ever being driven offline.

A CDN that has a strong DDoS toolset and knowledge will be able to protect web applications and sites from various malicious efforts, keep the load times rapid for users at peak times, and handle out-of-nowhere surges in demand. If you choose a managed service provider for your CDN that also has specialty in DDoS mitigation, you can know that the proxy servers of the CDN are giving you substantially improved security along with its clear performance benefits.

The right CDN to boost your efforts

Do you think a content delivery network could help your online efforts? At Total Server Solutions, our CDN utilizes equipment in over 150 data centers worldwide so that wherever your audience happens to be, they’re always close to your content. Plus, we are specialists in DDoS prevention. Load your content faster!

Posted by & filed under List Posts.

Atlanta, GA, December 13, 2018 – Total Server Solutions, a global provider of managed services, high performance infrastructure, and custom solutions to individuals and businesses in a wide range of industries, has formally announced the hiring of Jim Harris as Channel Director. Harris is an industry veteran with over 16 years of channel experience. At his 3 previous companies, Harris was tasked with developing and overseeing the start-up of their channel programs. He will look to continue and build on that success at TSS by providing the keys to TSS' extensive platform of custom-engineered services to the reseller community.

“TSS is truly excited to have Jim Harris join us as we continue our mission of providing our IT platform to customers who need global IT enablement. Jim brings many years of channel leadership and industry knowledge to our team. He will help TSS accelerate our go to market strategy as we head into 2019, which will be a break out year for TSS,” said Mike Belote, Vice President of Sales at Total Server Solutions. “The channel is important to TSS because it increases our visibility in the marketplace, and gives us the opportunity to build on our solid reputation as a trusted IT leader and partner, servicing over 4000 customers in 31 PoPs around the world, offering Infrastructure as a Service, Network, and Managed Services to companies who need that orchestrated global platform to access and manage IT workloads anywhere in the world.”

Harris will directly engage with the sales team to develop and implement a complete corporate channel strategy for Total Server Solutions, translating TSS' sales goals into channel strategies that create revenue for both TSS and its partners. In addition, he will oversee and administer the selection of channel marketing partners, budgets, and the positioning of all channel-related sales activities.

Previously, Harris served as National Channel Manager for Stratus Technologies, Peak 10, and Office Depot's CompuCom division. Originally from New York, he studied at Fredonia State University and at Embry-Riddle Aeronautical University in Daytona Beach, Florida, where he obtained his pilot's license. Jim is married with four children and resides in central Florida.

Gary Simat
Total Server Solutions
+1(855)227-1939 Ext 649

Tucker Kroll
Total Server Solutions

Posted by & filed under List Posts.

The term high performance computing is used both broadly and specifically. It can simply refer to methods that have been used to draw on computing power more innovatively in order to meet the sophisticated needs of business, engineering, science, healthcare, and other areas. In that form, HPC involves gathering large volumes of computing resources and providing them in a manner that is a significant improvement over the speed and reliability of a desktop computer. High-performance computing has historically been a specialty within computer science that is dedicated to the field of supercomputers – a major subfield of which is parallel processing algorithms (to allow various processors to handle segments of the work) – although supercomputers have stricter parameters, as indicated below.

Within any context, HPC is understood to involve the use of parallel processing algorithms to allow for better speed, reliability, and efficiency. While HPC has sometimes been used interchangeably with supercomputing, a true supercomputer operates at close to the highest possible rate for current standards, while high performance computing is not so rigidly delineated. HPC is generally used in the context of systems that achieve 10^12 floating-point operations per second – i.e., greater than a teraflop. Supercomputers are moving at another tier, Autobahn pace – sometimes exceeding 10^15 floating-point operations per second – i.e., more than a petaflop.

Virtualizing HPC

Through a virtual high performance computing (vHPC) system, whether in-house or through public cloud, you get the advantage of one software stack and operating system for the system – which comes with distribution benefits (performance, redundancy, security, etc.). You can share resources through vHPC environments. It enables a setting in which researchers and others can bring their own software to a project since computer resources are sharable. You can give individual professionals their own segments of an HPC ecosystem for their specific data correlation, development, and test purposes. Workload settings, specialized research software, and individually optimized operating systems are all possible. You are able to store images to an archive and test against them.

Virtualization of HPC makes it much more user-friendly on a case-by-case basis: anyone who wants high performance computing for a project specifies the core programs they need, the number of virtual machines, and all other parameters, through an architecture that you have already vetted. By choosing flexibility in what you offer, you can enforce your internal data policies, and data security is improved. Using this avenue also keeps your data out of silos.

2018 reports: HPC growing in cloud

Hyperion Research and Intersect360, both industry research firms, revealed in 2018 that an inflection point was reached within the market, per cloud thought-leader David Linthicum. In other words, we are at a moment when the graph is going to look much more impressive for the field. It already is impressive, though: organizations are rushing to this technology. There was 44% market growth in high performance cloud between 2016 and 2017 as it expanded to $1.1 billion. Meanwhile, the rest of the HPC industry, generally onsite physical servers, did not grow at even close to that pace over the same period.

Why is HPC being used increasingly? The simple reason the market for high performance computing keeps growing is speed. Certain projects especially need a network with ultra-low latency and ultra-high bandwidth, allowing you to integrate various clusters and nodes and optimize efficiency. To target complex scenarios, HPC unifies and coordinates electronics, operating systems, applications, algorithms, computer architecture, and similar components.

These setups are necessary to conduct work as effectively and quickly as possible within various specialties, with applications such as climate models, electronic design automation, geographical data systems, gas and oil industry models, biosciences datasets, and media and entertainment analyses. Finance is another environment in which HPC is in high demand.

Why HPC is headed to cloud

A couple key reasons that cloud is being used for HPC are the following:

  • cloud platform features – The capabilities of cloud platforms are becoming increasingly important, since people looking for HPC are now as interested in the features as they are in the performance. Those features make cloud infrastructure the more compelling place to run HPC workloads.
  • aging onsite hardware – Cloud is becoming standard for HPC in part because more money and effort is being invested in keeping cloud systems cutting-edge. The majority of onsite HPC hardware is simply not as strong as what you can get in a public cloud setting, in part because IT budget limitations have made it impossible for companies to keep HPC equipment up-to-date. Cloud is much more affordable than maintaining your own system; because cloud is budget-friendly, it gets more business and is, in turn, able to keep refitting and upgrading its systems.

HPC powering AI, and vice versa

The fact is that enterprises are incorporating HPC much more fervently now than in the past (when it was primarily used in research), as noted by Lenovo worldwide AI chief Bhushan Desam. The broader growth is due to AI applications. These two technologies actually work synergistically: AI is fueling the growth of HPC, but HPC is also boosting access to AI capabilities. It is possible to figure out what data means, and act on it, in just a few hours rather than a week because of HPC components such as graphics processing units (GPUs) and InfiniBand high-speed networks. Since HPC divides massive tasks into tiny pieces and analyzes them in that piecemeal manner, it is a perfect fit for the complexities of AI required by finance, healthcare, and other sectors.

An example benefit of HPC is boosting operational efficiency and optimizing uptime through engineering simulations and forecasting within manufacturing. In healthcare, doctors achieve diagnoses faster and more effectively by running millions of images through AI algorithms powered by HPC.

Autonomous driving fueled by HPC AI

To dig further within AI across industry, high performance computing is being used to research and develop self-driving vehicles, allowing them to move around on their own and create maps of their surroundings. Vehicles could be used within industry to perform dirty and dangerous tasks, freeing people to perform safer and more valuable jobs.

OTTO Motors is an Ontario-based autonomous vehicle manufacturer with clients in ecommerce, automotive, healthcare, and aerospace. To get these vehicles up to speed prior to being launched in the wild, the firm runs simulations that require petabytes of data. High performance computing is used in that preliminary testing phase, as all the kinks are worked out. It is then used within the AI of the vehicles as they continue to operate post-deployment. “Having reliable compute infrastructure [in the form of HPC] is critical for this,” said OTTO CTO Ryan Gariepy.

Robust infrastructure to back HPC

High performance computing allows for the faster completion of projects via parallel processing through clusters – increasingly virtualized and run within public cloud. A core piece of moving a workload to cloud is choosing the right cloud platform provider. At Total Server Solutions, our infrastructure is so comprehensive and robust that many other top tier providers rely on our network to keep them up and running. See our high-performance cloud.


Posted by & filed under List Posts.

Cloud computing is used by many organizations to bolster their systems during the holidays. The same benefits that cloud offers throughout the year become particularly compelling during the holidays, when traffic can surge and demand resources beyond what your standing IT infrastructure delivers. How can cloud give your company an advantage during the holidays and meet needs more effectively than traditional systems can? What exactly are horizontal and vertical scaling? How specifically does cloud help improve performance? How can cloud testing help in preparation for peak periods?

Why cloud is so powerful for the holidays

Within a traditional model, there are essentially two ways to go, as indicated by cloud management firm RightScale:

  • Underprovision – If you underprovision, you plan for the application's normal usage at all times. You would be very efficient throughout typical usage periods. The downside is that you would lose traffic when things got busy, because your capacity would be insufficient. You would be underprepared for peak periods such as the holidays and unable to keep up with the number of requests. Your credibility would suffer, as would your sales.
  • Overprovision – The other option is to launch resources to an extreme degree. You would be able to handle traffic at all times, but you would use resources inefficiently, because during normal periods you would have too many. You could absorb peak traffic such as the holidays, but your infrastructure would be needlessly costly year-round.

Cloud is a better option because the technology is designed to optimize scalability. It allows you to allocate and deallocate resources dynamically, avoiding the need to buy equipment just to answer the higher number of holiday requests.

It also allows you to deliver high availability. In a technological setting, availability is the extent to which a user is able to access resources in the correct format and from a given location. Along with confidentiality, authentication, nonrepudiation, and integrity, availability is one of the five pillars of Information Assurance. If your data is not readily available, your information security is negatively impacted.

In terms of scalability, since cloud allows you to scale your resources up and down as traffic fluctuates, you pay only for the capacity you need at the time, while retaining immediate access to sufficient resources at all times. A simplified sketch of that scale-up/scale-down decision follows.
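The sketch below illustrates the kind of threshold-based decision cloud platforms automate for you; the utilization thresholds, instance limits, and load samples are all invented for the example.

```python
# A simplified sketch of autoscaling: add capacity under load, release it
# when things quiet down, and pay only for what is running.
MIN_INSTANCES, MAX_INSTANCES = 2, 20
SCALE_UP_AT, SCALE_DOWN_AT = 0.75, 0.25     # CPU utilization thresholds

def decide(instances, load):
    """Return the new instance count for the observed load."""
    if load > SCALE_UP_AT and instances < MAX_INSTANCES:
        return instances + 1                # surge: add capacity, pay for more
    if load < SCALE_DOWN_AT and instances > MIN_INSTANCES:
        return instances - 1                # quiet: release capacity, pay less
    return instances

instances = 2
for load in [0.40, 0.80, 0.90, 0.85, 0.30, 0.10]:   # a holiday spike passing through
    instances = decide(instances, load)
    print(f"load {load:.0%} -> {instances} instances")
```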

Cloud scalability – why this aspect is pivotal

Scalability is the ability of software or hardware to keep operating when user needs require its volume or size to change – generally, to a higher volume or size. Rescaling may occur when a scalable object is migrated to a different environment, but it typically involves a change in the memory, size, or other parameters of the product.

Scalability means you can handle the load as it rises, whether that calls for more CPU, memory, network I/O, or disk I/O. Examples include the holidays, any time you run a promotion, or a burst of unexpected publicity. Your servers alone may not be able to handle the sudden onrush of traffic; with cloud-enhanced scalability, your site or app will not go down even if thousands of people are using your system at once. You keep selling and satisfying your customers.

Performance is a primary reason for optimizing your scalability. Scaling occurs in two directions, horizontal and vertical:

  • Horizontal – When you scale horizontally, you add more hardware to your ecosystem so that more machines share the load. As you add hardware, your system grows more complex: every server you add brings additional needs for backup, syncing, monitoring, updating, and all the other server management tasks.
  • Vertical – With vertical scaling, you simply give a given instance additional resources. This path is simpler because you do not need special software setup, and the hardware sits outside your walls: you can attach cloud servers that are already virtualized and ready to use. The sketch below contrasts the two directions.
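The following sketch contrasts the two directions against a hypothetical cloud client. Neither method is a real SDK call; they stand in for whatever your provider's API or console exposes.

```python
# Hypothetical client illustrating the two scaling directions.
class CloudClient:
    def add_instances(self, group, count):
        """Horizontal: more machines behind the load balancer --
        more capacity, but also more servers to back up, sync, and patch."""
        print(f"Scaling out {group} by {count} instances")

    def resize_instance(self, instance_id, cpu, ram_gb):
        """Vertical: the same machine, just bigger --
        simpler to manage, bounded by the largest size on offer."""
        print(f"Resizing {instance_id} to {cpu} vCPUs / {ram_gb} GB RAM")

client = CloudClient()
client.add_instances("web-tier", count=3)                 # scale out for the holidays
client.resize_instance("db-primary", cpu=16, ram_gb=64)   # scale up the database
```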

Cloud’s impact on performance

Cloud addresses two key aspects of site performance – site speed and site uptime:

  • Site speed – Page load time, also known as site speed, is one of the factors that determines your search engine rank. Speed affects how well you show up in search; more importantly, it determines how well you meet the needs of those who come to your site, where even improvements of a fraction of a second help. There are many ways to speed up your site in conjunction with better infrastructure, including getting rid of files you do not need, using a strong caching strategy, removing extraneous metadata, and shrinking your images (see the caching sketch after this list).
  • Site uptime – Uptime is critical to strong sales: your site must be available for users to browse products, decide what they want, and place orders. When the site is unavailable, customers get an error page (or a timed-out request) in their browsers instead of what they want, which means you cannot make the sale. It also hurts your search engine rankings, which account in part for availability. And it can cost you future sales, since a user frustrated by a dead page may not come back to complete their shopping. You do not want your site to ever go offline.
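As one concrete example of the caching tactic mentioned above, the intentionally bare WSGI app below marks static assets as cacheable for a year while forcing dynamic pages to revalidate. Most stacks set these headers in the web server or CDN configuration instead; this is just a sketch of the headers themselves.

```python
# Minimal sketch: tell browsers and CDNs to cache static assets so repeat
# visitors skip the round trip, while dynamic pages stay fresh.
from wsgiref.simple_server import make_server

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path.startswith("/static/"):
        # Static files change rarely: cache aggressively for a year.
        headers = [("Content-Type", "text/plain"),
                   ("Cache-Control", "public, max-age=31536000, immutable")]
    else:
        # Dynamic pages: let the browser revalidate on each visit.
        headers = [("Content-Type", "text/html"),
                   ("Cache-Control", "no-cache")]
    start_response("200 OK", headers)
    return [b"ok"]

if __name__ == "__main__":
    make_server("", 8000, app).serve_forever()
```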

Cloud testing in advance 

The basic reason ecommerce sites fail is that they are not tested appropriately. Without this testing, companies do not know how well their sites will perform under huge surges of holiday traffic. If they conducted the relevant testing, they would discover whatever performance issues their servers have well ahead of time.

To avoid these issues, you want to test. The traditional approach is to buy testing hardware that you will hardly ever use. The other option is cloud testing.

With cloud testing, independent infrastructure-as-a-service (IaaS) providers (aka cloud hosts) supply easily manageable, scalable resources for the task. Using cloud hosting to simulate the traffic an app or site would experience in the real world saves money on testing. You can see how your site stands up when it is hit with particular loads of traffic, from various types of devices, according to the rules you establish.

There is another benefit of cloud testing, too: because it originates outside your in-house network, it is closer to your actual traffic model. A bare-bones example of the idea appears below.
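This sketch shows the core of a load test: concurrent simulated users hitting a staging URL while response times are recorded. Real cloud testing services layer geographic distribution, device profiles, and ramp-up rules on top of this same idea; the URL and user counts here are placeholders you would replace with your own.

```python
# Bare-bones load test: N concurrent "users" each make several requests,
# then we report median and 95th-percentile response times.
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

URL = "https://staging.example.com/"   # hypothetical test target -- use your own
CONCURRENT_USERS = 50
REQUESTS_PER_USER = 10

def one_user(_):
    timings = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        urlopen(URL, timeout=10).read()
        timings.append(time.perf_counter() - start)
    return timings

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
        results = [t for user in pool.map(one_user, range(CONCURRENT_USERS)) for t in user]
    results.sort()
    print(f"median: {results[len(results) // 2]:.3f}s, "
          f"p95: {results[int(len(results) * 0.95)]:.3f}s")
```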

Adding cloud for better sales this holiday

Do you think bolstering your system with cloud servers might be right for you this holiday season? At Total Server Solutions, where we specialize in high-performance e-commerce, our cloud uses the fastest hardware, coupled with a far-reaching network. We do it right.

Posted by & filed under List Posts.

Your company may have already invested substantially in systems and training for compliance with the General Data Protection Regulation (GDPR), the key global data protection law from the European Union (which went into effect in May 2018)… or it may still be on your organization’s to-do list. While the GDPR is certainly being discussed with great fervor within IT security circles in 2018, compliance is far from ubiquitous. A report released in June found that 38% of companies operating worldwide believed they were not compliant with the law – and that number is surely higher once the unknowingly noncompliant are included.

Who has to follow the GDPR?

Just about every company must be concerned with the GDPR if it wants to limit its liability. To be clear, if you run an ecommerce company, you have to follow the GDPR whether or not you accept orders from European residents, as indicated by Internet attorney John Di Giacomo. The GDPR applies to every organization that monitors the behavior of, or gathers data from, EU citizens. A mailing list that European users can sign up for, for example, must adhere to the GDPR. The law also applies if you use beacons or tokens on your site to monitor the activity of European users – whether or not your company has a location in an EU state.

The following are core steps to guide your organization toward compliance.

#1 – Rework your internal policies.

You want to move toward compliance even if you are not quite there yet. Pay attention to your policies for information gathering, storage, and usage, and make sure they are all aligned with the GDPR – with special attention to usage.

Write up records covering all your provider relationships. For instance, if you transmit your email data to a marketing platform, you want to be certain the data is safeguarded in that setting.

It is also worth noting that smaller organizations will likely not have to worry as much about this law as the big corporations will, per Di Giacomo. While the European Union regulators have already set their sights on the megaenterprises such as Amazon and Facebook, “a $500,000 business is probably not the chief target,” said Di Giacomo.

#2 – Update your privacy policy.

Since data privacy is so fundamental to the GDPR, one element of your legal stance that must be revised in response is your privacy policy; the GDPR specifically mandates updated language. Your post-GDPR privacy policy should include:

  • How long you will retain users’ data, along with how it will be used;
  • The process through which a person can get a complete report of all the information you have on them, choosing to have it deleted if they want (which is known as the “right to be forgotten” within GDPR compliance);
  • The process through which you will let users know if a breach occurs, in alignment with the GDPR’s requirement to provide notification within 72 hours when records are compromised; and
  • A description of your organization and a list of any other parties that will be able to access the data – any affiliates, for instance. (A minimal sketch of the access and deletion obligations follows this list.)
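As a minimal sketch of the two user-facing obligations above – producing a complete report of a person's data and honoring the "right to be forgotten" – the following uses an in-memory dictionary as a stand-in for a real datastore. A production version would also need to reach backups and third-party processors.

```python
# Toy datastore: one record per user, keyed by email.
user_records = {
    "alice@example.com": {"orders": [1001, 1002], "newsletter": True},
}

def export_user_data(email):
    """Subject access request: return a complete copy of what is stored."""
    return dict(user_records.get(email, {}))

def forget_user(email):
    """Right to be forgotten: erase the record and confirm the deletion."""
    removed = user_records.pop(email, None)
    return removed is not None

if __name__ == "__main__":
    print(export_user_data("alice@example.com"))   # full report for the user
    print(forget_user("alice@example.com"))        # True: record erased
```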

#3 – Assign a data protection officer.

The GDPR is a challenging set of rules to incorporate, particularly if you handle large volumes of sensitive personal data. You may need to appoint a data protection officer to manage the rules and requirements. The officer would both ensure the organization’s compliance and coordinate with any supervising bodies as applicable.

Per the International Association of Privacy Professionals (IAPP), 75,000 new data protection officer positions will have to be created for companies to stay on the right side of the GDPR.

#4 – Assess traffic sources to ensure compliance.

In European Union member states, there has been a steep drop in spending on programmatic ads. Ad purchasing has dropped in large part because there are not very many GDPR-compliant ad platforms (as of June 2018), per Jia Wertz. Citing Susan Akbarpour, Wertz noted that the dearth of GDPR-compliant advertising management systems would remain an issue for some time, because ad networks, affiliate networks, and programmatic ad platform vendors are slow to move away from cost per thousand (CPM), click-through rate (CTR), and similar cookie-based metrics.

Before the GDPR, ecommerce companies were able to store cookies in consumers’ browsers freely. The GDPR requires that all details related to the use of cookies be fully available to online shoppers. With those notices now necessary, CPM and CPC rates are negatively impacted; the GDPR has essentially made these numbers an unreliable way to measure success.

#5 – Shift toward creative ads.

Since programmatic ads have been challenged by the GDPR, it is important to redesign your strategy and shift more of your focus to creative advertising. You can use influencer marketing to build recognition, bolstering those efforts with public relations.

Any programmatic spending should be carefully considered, per Digital Trends senior manager of programmatic and yield operations Andrew Beehler.

#6 – Rethink opt-in.

No matter what your purposes are for the information you collect, you have to follow compliance guidelines from the moment of opt-in forward. Concern yourself with both transparency and consent. In terms of transparency, you must let users know why you are gathering each piece of data and how it will be used. Minimize what you collect so your explanation stays short, and do not collect key information such as addresses and phone numbers unless you really need it.

As for consent, you now must obtain that agreement very directly – the notion of explicit consent. If an EU citizen buys from your site, you cannot email them discounts or an ebook unless you have their explicit consent. That means you cannot default-check checkboxes and consider that a valid opt-in.

Additionally, you want your Terms of Service and Privacy Policy to be linked, with checkboxes for people to mark that they have read them. A sketch of what explicit consent looks like in code follows.
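Here is a small sketch of explicit consent in practice: the checkbox arrives unchecked by default, the opt-in counts only if the user actively ticked it, and the consent is logged with its purpose and timestamp. The field names are illustrative, not from any particular framework.

```python
from datetime import datetime, timezone

def record_opt_in(form, consent_log):
    """Reject anything that is not an active, affirmative opt-in."""
    if form.get("marketing_consent") != "on":   # unchecked box => no consent
        return False
    consent_log.append({
        "email": form["email"],
        "purpose": "marketing emails and discounts",
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return True

consent_log = []
# Valid: the user ticked the (default-unchecked) box themselves.
print(record_opt_in({"email": "a@example.com", "marketing_consent": "on"}, consent_log))
# Invalid: no box ticked -- a pre-checked default would never pass here.
print(record_opt_in({"email": "b@example.com"}, consent_log))
```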

#7 – Use double opt-in in some scenarios.

You do not need double opt-in by default to meet the GDPR. However, you do need to make sure your consent language is clearly worded and easy to read, so that the people using your services can understand how their data will be used.

An example would be a person signing up for a newsletter. The agreement should state that the user agrees to join the list and that their email will be retained for that reason.

The consent should also link to users’ GDPR data rights. One right that is important to mention: they can request a notice describing data usage, along with a copy of the specific data being stored. A minimal double opt-in sketch appears below.
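This is a minimal double opt-in sketch, assuming a hypothetical send_email() helper in place of a real mail provider: signup stores a pending token, and the subscription only becomes active when the emailed confirmation link is used.

```python
import secrets

pending, confirmed = {}, set()

def send_email(address, body):
    print(f"to {address}: {body}")        # placeholder for a real mailer

def sign_up(email):
    """Step 1: store a pending entry and send the confirmation link."""
    token = secrets.token_urlsafe(16)
    pending[token] = email
    send_email(email, f"Confirm your subscription: https://example.com/confirm/{token}")

def confirm(token):
    """Step 2: activate only when the user clicks the emailed link."""
    email = pending.pop(token, None)
    if email:
        confirmed.add(email)              # consent is now explicit and verifiable
    return email is not None

if __name__ == "__main__":
    sign_up("shopper@example.com")
    token = next(iter(pending))           # in reality, the click supplies the token
    print(confirm(token))                 # True: subscription is now active
```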

#8 – Consider adding a blockchain layer.

Blockchain is being introduced into advertising to provide a decentralized layer, making it possible for ecommerce companies to more seamlessly incentivize both the people who promote them and the users who perform verification, all within a single ecosystem.

Blockchain is still being evaluated in terms of how it can improve retail operations, security, and accountability. Blockchain will improve on what is available through programmatic advertising by providing more transparent information. “Blockchain is here to disrupt antiquated attribution models, remove bad actors and middlemen as well as excess fees,” noted Akbarpour.

#9 – Use ecommerce providers that care about security and compliance.

Do you want to build your ecommerce solutions in line with the General Data Protection Regulation? The first step is to work with a provider with the expertise to design and manage a compliant system. At Total Server Solutions, we’re your best choice for comprehensive ecommerce solutions, software, hosting, and service. See our options.