What Is GDPR Compliance?

Posted by & filed under List Posts.

Is your organization ready for the May 25, 2018, effective date of the General Data Protection Regulation (GDPR)? This short guide gives you a sense of what guidelines it contains, along with whose data it safeguards and who will have to follow the rules.


Understanding the GDPR


5, 4, 3, 2, 1… Second by second, the European Union is counting down the amount of time left until the enforcement of the General Data Protection Regulation begins. The GDPR is a set of stipulations developed by the European Union for the safeguarding of data. It was enacted because European nations were still working with 1995 legislation (Directive 95/46/EC).


The official GDPR site notes that the law is intended to create greater common ground between the different information privacy laws that are currently in force in different countries on the continent. It is also meant to give better privacy rights to citizens. Contained in this regulation are some significant shifts for individuals as well as for organizations that manage or in any way interact with sensitive personal data.


The GDPR is big news in part because it is a long time coming – the result of over four years of negotiating and fine-tuning. The European Commission started outlining its proposed strategies for reforming the treatment of data privacy in January 2012. The idea of this effort was to make sure that the European nations were in good position for the digital era. Although there were other reforms laid out at the time as well, the GDPR was central.


The European Parliament and European Council both passed this new framework in April 2016 – at which point the directive and regulation were made public. Then in May 2016, the EU Official Journal published the GDPR. The GDPR is on everyone’s minds lately in the security and IT fields because we are ramping up to the date when it becomes effective: May 25, 2018. The idea for that two-year stretch prior to the law going into force was that it would give both individuals and businesses ample time to get ready for compliance.


When the law was passed, Digital Single Market VP Andrus Ansip noted that the treatment of the confidential information of the European people had to be based on an educated knowledge that data was being protected against unauthorized access. “With solid common standards for data protection,” he said, “people can be sure they are in control of their personal information.”


What businesses must be GDPR compliant?


All EU member states have to comply with the General Data Protection Regulation, and it affects nations outside Europe as well.


In the United Kingdom, many people are confused about this legislation because it was negotiated prior to Brexit. It is essentially being put into effect in the UK via a Data Protection Bill that mandates many (though not all) of the same standards and protocols.


Any companies that are not within the EU but that provide services or goods to European people and/or organizations have to comply with the law. The GDPR is of great interest to all global enterprises, as well as small businesses that are doing business on the continent. Because that’s the case, this issue is high-priority across just about every industry.


How the GDPR changes things


Businesses get hacked and otherwise experience data breaches all the time. Data may be stolen by cybercriminals or otherwise become accessible to unauthorized parties that are not supposed to be able to view it. Assuming that these parties are malicious, the situation can quickly turn into a nightmare.


To guard against these scenarios, the GDPR gives citizens the right to examine the information that is held about them by different organizations.


Businesses and agencies need to give people access to their data while meeting certain information management requirements. They can only collect and use data as described within the legislation. Furthermore, firms that manage information have to secure it so that it is not used for nefarious purposes. They must respect the rights of the owners of data as detailed within the law. Otherwise, they can get fined according to the new table released in the law.


Beyond the above parameters, the other aspect that is new is the expanded liability of organizations that handle data on the behalf of others – called data processors under the law (see below).


Data controllers & data processors


The law places the businesses that must meet compliance in two categories: data controllers and data processors. The GDPR’s Article 4 describes these two types of organizations:


  • Data controller: A data controller is an individual, public agency, or another organization (i.e., any company) that, either by itself or in collaboration with outside entities, decides why and how digital information is processed, stored, or otherwise handled.
  • Data processor: A data processor is an individual, public agency, or another organization (again, could be any business) that manages data for a controller. Note that if you are in the UK and the Data Protection Act applies to your organization, the GDPR will probably be applicable as well (since its essence is being implemented).


“You will have significantly more legal liability if you are responsible for a breach,” notes the UK’s Information Commissioner’s Office. Specifically, processors are now liable.


The General Data Protection Regulation makes it necessary for processors to keep records related to information and its management. In this manner, it becomes a much more significant legal concern to follow industry best practices, avoid corporate negligence (failure to use accepted standards for data protection), and make sure that information is actually secure.


Furthermore, GDPR compliance will now apply to all legal agreements between processors and controllers.


Close parallels between HIPAA & the GDPR


From a compliance perspective, these designations are interesting because they are so similar to the law that has developed in the United States related to the protected health information (PHI) that is the subject of HIPAA compliance – i.e., abiding by the Health Insurance Portability and Accountability Act of 1996. HIPAA has always applied to both covered entities (roughly equivalent to the controllers) and business associates (roughly equivalent to the processors). Also, US law requires that a contract called a business associate agreement (BAA) must be signed between every covered entity and business associate, just as agreements must be signed into effect between controllers and processors.


What are the penalties for noncompliance?


There are extremely strong fines for failure to comply with the GDPR, with violations leading to fines as high as the greater of 4% of annual worldwide turnover (total sales) or 20 million euros (roughly 24.4 million USD).
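The "greater of" cap works out to a simple maximum. A minimal sketch of the upper-tier calculation described above (the function name and the sample turnover figures are illustrative, not part of the regulation's text):

```python
def gdpr_max_fine(annual_turnover_eur: float) -> float:
    """Upper-tier GDPR fine cap: the greater of 4% of annual
    worldwide turnover or a flat 20 million euros."""
    return max(0.04 * annual_turnover_eur, 20_000_000)

# A firm with 1 billion euros in turnover faces up to 40 million euros;
# a smaller firm with 100 million euros still faces the 20-million-euro floor.
```

Note that for any business with under 500 million euros in turnover, the flat 20-million-euro figure is the binding cap.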


Incredibly, a recent survey found that 52% of organizations think that they will get GDPR fines, while another report predicted that the new law would result in $6 billion of fines from the European Union in its first year alone.


Your GDPR-compliant hosting plan


Does your organization need to be GDPR compliant? It does if it in any way comes into contact with the data of European citizens or businesses, whether it is classified as a controller or a processor.


At Total Server Solutions, we offer GDPR-compliant hosting. In fact, we previously established our data protection through an audit to meet the service control standards devised by the American Institute of Certified Public Accountants’ Statement on Standards for Attestation Engagements 16 / 18 (SSAE 16 / 18). See our beliefs.

How the IT Threat Landscape Will Change in 2018 – Part 2 of 2


<<< Go to Part 1


#6 – Competition with government for identity verification


One thing that should be learned from the compromise of Equifax, according to the Forrester report, is that individual organizations should not be put in the position of providing reliable verifications of identities and protecting the information of consumers – particularly when people are using digital environments for more of their day-to-day needs.


Big banks will get into the identity verification market in 2018, suggests Forrester. Users will also start having the option to use login details from financial institutions to access government systems. Blockchain, utilizing integrated data from online payments, will become more prominent as a technology that can aid with verification.


The researchers suggest that reviewing possible services you could use for identity verification is urgent in 2018. The key characteristics that you want in the institution you choose are credibility; data protection protocols and compliance; coverage; and support.


#7 – Victimization of POS systems by ransomware


End-to-end encryption has been more broadly deployed within transaction platforms; thus, point of sale (POS) systems are not as reliable a source to target for credit card information. With that option blocked, attackers are switching to ransomware so that they get money through extortion rather than selling the data. Someone who gets targeted with ransomware might pay the ransom simply because they cannot get into their system.


Forrester urges businesses not to pay any ransom to cybercriminals if they find themselves in this situation. To protect yourself, prioritize your disaster recovery plans. Daily backups should be one key element of your preparation.


#8 – Discrepancy between board understanding & actual situation


The board of the company may not completely have a sense of the technologies that are needed, even if its members acknowledge that digital security is one of the highest priorities.


Durbin notes that a board often feels the CISO is managing everything appropriately. Board members often are not able to communicate exactly what they want because of lack of familiarity with the approaches and options. From the other side, the CISO may not be able to convey exactly what they want or need to the board.


Company boards often think that the information security team and CISO have been able to make strides after confirming boosts to security budgets in recent years. However, it needs to be understood that a 100% rock-solid security approach is impossible. Beyond being clear that a defensive stance will always have weak points that could be improved, there is also a tendency to set unreasonable timeframes, regardless of the knowledge level you currently have in-house.

According to Durbin, when the board does not have a good handle on security in these ways, a breach that occurs could have negative impact on the business – but also on the members of the board.


Since the threat landscape is becoming increasingly complex, an information security chief needs to go beyond maintaining a firewall to predicting and being prepared. Data security leaders should be aware of the influence of internal and external issues on the organization and be able to communicate the situation to the board. In that sense, the ISF sees it as critical that the CISO be both a salesperson and a consultant, able to give solid information and to be convincing; just don’t hard-sell so much that you become Alec Baldwin in Glengarry Glen Ross.


#9 – Transition of focus & venture capital from AI to blockchain


Transactional integrity, policy tamper detection, and guarantees of distributed integrity are avenues in which dedicated architectures and cloud technology leverage blockchain to better encrypt and secure data.


There will be various ways that blockchain is a valuable method for business, per Forrester. Four of the key ones that will be top use cases during 2018 are integrity and authenticity verification for documents; binary reputation checks to defend against ransomware and malware; identity verification (IDV); and certificate provision/authentication.


Back in 2016, security providers were all intent on ensuring that their offerings fully incorporated artificial intelligence (AI). In 2018, blockchain will be a similar technology, says Forrester. This year, many tech startups will offer blockchain security. These new organizations will challenge established organizations to adapt and implement blockchain so that the new wave does not have a competitive advantage.


Forrester advises talking with your security providers about their implementation of blockchain.


#10 – Increasing sophistication of security within business


One way that companies are changing to better protect users is that passwords are no longer seen as the ideal way to authenticate access, notes Wayne Rash in his 2018 trends piece for PC Magazine. Use of biometrics will become more common for authentication in business settings. Iris recognition and facial recognition can be used in isolation or as components of multi-factor authentication (MFA).


MFA is a standard approach that is only becoming more widely adopted. In 2017, the basic way MFA took place was with codes transmitted to people’s phones; in 2018, biometrics will become core to these processes. Software that steals user login info will be less effective at organizations that use codes transmitted to phones, smart cards, or biometrics as a means of multi-factor authentication.
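The phone-code style of MFA mentioned above is commonly implemented as a time-based one-time password (TOTP), standardized in RFC 6238. As a hedged sketch of the underlying mechanism (the `totp` helper name is our own; authenticator apps and servers each run this same computation and compare results):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, step=30):
    """Compute an RFC 6238 time-based one-time password (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32, casefold=True)
    # The moving factor is the number of `step`-second intervals since the epoch.
    counter = int(time.time() if at is None else at) // step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Because both sides derive the code from a shared secret plus the current time, a stolen static password alone is no longer enough to log in.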


Rash points to what he sees as another, somewhat controversial way that security is improving: the declining value and popularity of cryptocurrency such as Bitcoin. Some of the blockchain formulas currently used in cryptocurrency have weaknesses, and law enforcement is finding methods that allow them to monitor the finances as they pass from account to account. In late 2017, a story broke about the cryptocurrency Monero: hackers were using tools devised by and leaked from the National Security Agency to make their efforts more efficient and rewarding. After entering and assuming control of Linux and Windows servers, these attackers were using the NSA programs to distribute their currency mining across the target networks.


Criminals need cryptocurrency to be stable in order for it to ultimately serve their purposes, getting the money into their accounts through ransomware and other tactics – so incidents such as the one involving Monero are effective in reducing the allure of fraud that uses cryptocurrency as a component.


While there are elements of the security landscape that are brighter than they have been, there continues to be a large volume of diverse and increasingly sophisticated threats. The number of hacks that take place in 2018 will be greater than in 2017, forecasts Rash. Criminals will continue to come up with workarounds that get past protections. Security will become more challenging all the time.


Given the rise in security incidents and security’s paramount role in supporting the safe growth of your business, it is critical to have a clear and consistent path forward.


In that sense, says Rash, it is key in 2018 to “focus your resources on prevention and on supporting the security efforts of [your company’s security chiefs].”


A secure, high-performance infrastructure


Do you want to protect your internal systems and customers from data breaches? It all starts with an infrastructure that is third-party-verified to meet top-tier security standards. At Total Server Solutions, our SSAE 16 Type II audit is your assurance that we follow best practices for keeping your data safe and available. See our SSAE 18 / SSAE 16 security commitment.

How the IT Threat Landscape Will Change in 2018


In late 2016, Forrester forecast that automation and security services would be used increasingly to meet a shortage of tech talent, that greater than half a million IoT devices would be hacked, that compromises of healthcare systems would become as extensive and prevalent as previous ones within retail, and that a significant IT security breach in the Trump administration would be revealed within the initial 100 days. All those predictions came true. Similarly, the Information Security Forum (ISF) was right with many of its 2016 predictions as well. This two-part security mini-guide looks at thoughts from those two organizations on how the threat landscape will evolve in 2018.


#1 – Expansion of crime-as-a-service


Steve Durbin, managing director of the nonprofit Information Security Forum (ISF), forecast in late 2016 that crime-as-a-service (CaaS) would expand massively in the year ahead as crime rings established more intricate structures, associations, and affiliations that reflect the robust and highly controlled mechanisms of enterprises.


Durbin states that his projection did come true, unfortunately, as crime-as-a-service was the central component of generally increased cybercrime activity. ISF again sounds the alarm this year that CaaS will continue to be a huge concern, with crime syndicates now specializing their efforts to suit niche markets and turning their malicious work into a traded international commodity. Organized crime will sometimes be the basis of companies that have other business functions; in other cases, cybercrime units operate as independent businesses.


A main way that CaaS will be evolving in 2018 is that more people Durbin describes as “aspirant cybercriminals” who are not necessarily adept at hacking will increasingly be able to cause greater damage through services and programs that they purchase.


In previous years, ransomware involved shutting down your IT systems and demanding payment, often as cryptoware that encrypted your data and locked you out of it. Once payment was made, the intruder would stop their attack. That expectation depends on trust. Because aspirant hackers have started to use ransomware so much, businesses are – wisely – unlikely to trust that their services will be restored if they pay. Even if services are restored, you may have an issue with the perpetrators coming back repeatedly for additional payoffs. Businesses will become more aware of this issue.


CaaS will also be used through social engineering in 2018. Social engineering methods are a point of concern related to staff training since they are directed at single people instead of the organization. Security is so increasingly centered on the individual user that Durbin says lines blur between the individual and the enterprise; he concludes, “The individual is increasingly the enterprise.”


#2 – More frequent IoT assaults with different goals


The Internet of Things (IoT) was thriving in a sense in 2017, but really only in limited industries and contexts. There will be a terrific growth in the number of IoT devices in 2018.


Understanding and managing that data could lead to huge competitive advantages, boosting the demand for big data analysis.


There is a glaring issue with the IoT, though, as indicated by Forrester. The research firm notes that the rise of the IoT will also spur additional IoT hacking efforts that will have a different intent (related to the IoT devices themselves). The standard way cybercrime has utilized the IoT is as a way to form a botnet of slave zombie devices to use in distributed denial of service (DDoS) attacks. In 2018, attackers will start to become more interested in the data within the IoT devices, stealing it or blocking it to extract ransom.


#3 – Supply chain will continue as biggest issue with risk management


The ISF has long been concerned with the challenge posed to security by the supply chain. Large amounts of critical data may be shared with suppliers, in scenarios that necessarily involve giving over aspects of control to them. It is extremely important to know that the supplier is going to properly treat the data so that it is kept private, secure, and available.


Durbin noted that 2017 saw large manufacturing companies unable to maintain full production after losing access to some of their supplies – so this issue is key.


Furthermore, the notion of a supply chain extends far beyond manufacturing. Every organization has suppliers. You want to understand where your data is and how it is being protected (as with datacenters audited to meet the SSAE 18 / SSAE 16 standard), especially if it is being shared or entrusted to a third party.


2018 will be a year in which companies start to scrutinize their supply chains for full-lifecycle data protection. A proactive security stance will be more widely embraced. Durbin advises using services that have appropriate assurance related to the risk, building your fortress of safeguards out of repeatable, scalable processes. It is crucial to integrate supply chain IT risk management in your buying and vendor management policies.


#4 – General Data Protection Regulation prominent in security conversations


The General Data Protection Regulation (GDPR), a set of rules and standards put together through the European Union, will go into effect in May 2018. There are severe fines and sanctions for organizations that violate the laws set forth, which generally uphold consumer and end-user protections. The fines really are significant, as high as the greater of 4% of annual worldwide turnover (total sales) or 20 million euros.


The GDPR is in place for everyone who lives in Europe, and it applies to businesses that are within Europe as well as those who do business in its member nations. GDPR is about safeguarding consumer as well as staff information.


A chief concern recently is that companies have been increasingly monitoring their workforce as a way to guard against internal cybercrime, human error, and hackers with stolen login data. That may be well-intentioned; however, it can also be considered an invasion of privacy from the perspective of anyone on staff.


A ruling issued by the European Court of Human Rights in September stated that organizations have to let any personnel know ahead of time if their email accounts in the workplace will be watched. Additionally, any surveillance that does occur cannot come at the unreasonable expense of the employee’s privacy. The GDPR additionally relates to the privacy and data management of workers and can lead to large fines if its stipulations are violated.


The Forrester researchers advise that these laws are geared toward stopping improper handling of customer data. However, the information of employees is personal data, regardless of the fact that it sits within the company’s systems. Forrester expects regulators to start to focus increasingly on employee privacy.


Durbin notes that the GDPR comes up in virtually every conversation he has related to security with anyone in the world.


#5 – Possible malicious impact on United States midterm elections


Forrester states bluntly in its report that the United States has been failing to address systemic flaws in the voting process, in which computer programs are used for voting, as well as counting, verification, and reporting.


The analyst firm notes that the attacker would not even have to access a voting machine itself. They could “use compromised Windows machines to adjust the voting tabulation results in web-accessible software,” states the report; alternatively, they could modify a database or spreadsheet of totals from individual precincts.


The huge swaths of data that were taken in the attacks on numerous state agencies, the Republican National Committee, and Equifax will make it easier for malicious parties to submit fraudulent votes in areas where the vote is close, says Forrester.


Click here to read Part 2


High-security, high-performance infrastructure


Are you concerned about properly safeguarding the data being entrusted to your organization? In 2018 more than ever, you need IT partners that prioritize security.


At Total Server Solutions, our high-performance infrastructure adheres to the SSAE 18 / SSAE 16 standard from the American Institute of Certified Public Accountants. See our security commitment.

Pivotal Elements for Ecommerce Success in 2018


Here are a few key tips for succeeding at e-commerce in 2018, related to technologies, SEO, and other aspects of business:


Friendlier checkout


No one, of course, wants to get stuck in checkout: for the same reason we avoid the long line at the supermarket, we do not want it to take us 10 minutes to enter card information and get through to the confirmation page. Checkout is getting easier to achieve, and much faster, through wallet apps and mobile payments.


Social media


We all understand how pivotal social media can be for the success of a business. In its early years, social media was considered more of a side effort to ecommerce, meant to increase awareness of the brand and build relationships. Today, it has become fundamental to ecommerce success. Consider that many people are now buying products straight through Instagram.




Original content


Search engines place a great deal of emphasis on the originality of content on a site. In other words, the more high-quality, fresh, creative ideas and images that are presented on your site, the more likely people are to find it.


The issue for ecommerce sites is that they will often have many different products and need descriptions for each. Since it is so daunting to come up with your own content related to these products, you may end up simply reposting stock material from the manufacturer. That approach is detrimental because descriptions are a great opportunity to catch the attention of the search spiders by avoiding the sin of duplicate content. The core rule with original content is not to focus it excessively on sales but to provide information as a free, user-friendly resource. The information you share should be useful and help people to compare and contrast different product options.


Similar to using the same language as the manufacturer, you do not want to use databases and templates that are used elsewhere. These elements will also hurt your rankings because Google and Bing know that you are not the first to use them. Change all content so it is your own, thoroughly reframing and rewording the descriptions. Use appropriate keywords while avoiding keyword stuffing.




Video content


Typically the first concern people will have when they want content is text and relevant images for their product or service pages, blogs, and social media. However, video is becoming more dominant. One estimate suggests that video will account for 80% of all web traffic by 2020. It is a way to have something similar to a one-on-one presentation to the customer even though you are in different locations.


Figures suggest that video is powerful enough to result in 97% higher purchase intent and 200-300% higher click-through rates.


There are all kinds of tools and platforms for creating strong and unique video for your offerings. With Slidely, you have an environment that is integrated into your social profiles for immediate sharing. You can find out locations of viewers and how long people stay tuned with analytics from Wistia or similar systems.


No approach is right for every video. However, live videos will generally create a greater boost. Compared to pre-recorded video, live video can drive as much as 300% higher engagement.




Brand storytelling


Content is something to strategize in volume, but it is also something to consider from a more granular perspective if you want it to yield an incredible impact. Today, companies that excel at user experience and relationships are going beyond simply displaying and describing products to developing a compelling brand story. To craft your narrative, work with content professionals to build it; then integrate it company-wide, throughout social platforms, email newsletters, order confirmations, and packaging.


Augmented reality


Augmented reality (AR) is a developing and sophisticated way to capture the attention of your target audience. It is fast and gives your audience a sense of immersion within your brand. Some thought-leaders think that AR will become a bigger part of social platforms in 2018 – and that is almost stating the obvious. An AR feature within Snapchat allows users to “project” their image and include Bitmoji. As with the Place app by IKEA, it is also possible for a retailer to project products within the homes of social users.




Automation


Ecommerce automation has become a central concern for merchants. Increasingly, the sophistication of your automation mechanisms will determine if you are able to keep up with competitors.


Automation is a broad undertaking, since the practice can be applied in many areas. One possible target is fraud prevention, where automation protects your business much as marketing automation spreads your message. Ecommerce systems let you set rules that automatically forward orders with an estimated mid-range risk to the finance department for review; when risk is high, the system can respond with an automatic cancellation.
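Rule-based fraud routing of this kind can be sketched in a few lines. The thresholds, labels, and function name below are illustrative, not any particular platform's API:

```python
def route_order(risk_score, review_threshold=0.4, cancel_threshold=0.8):
    """Route an order based on an estimated fraud-risk score in [0, 1].

    Thresholds are illustrative; real ecommerce platforms let
    merchants tune these rules to their own risk tolerance.
    """
    if risk_score >= cancel_threshold:
        return "cancel"           # high risk: cancel automatically
    if risk_score >= review_threshold:
        return "finance_review"   # mid-range risk: forward to finance
    return "fulfill"              # low risk: process normally
```

The benefit is that only the ambiguous middle band ever consumes a human reviewer's time; clear-cut cases are handled instantly.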


From a general perspective, automation frees up time so that you and your staff are not constantly entangled in mundane tasks.




Personalization


The time that you can save through automation can be redirected to emergent and ongoing big-picture concerns such as personalization. Personalization allows your display of products and the content of your emails to perfectly suit the particular person and situation.


Visitor review system


People will often abandon ecommerce sites because they do not trust them. A sense of uneasiness may continue with a person into the checkout process if they do not see any information validating a choice that they are ready to make.


With reviews from other customers easily available, shoppers get a boost of confidence from a fellow buyer’s perspective. The other positive of customer reviews is that they are user-generated and contain original comments that will add to your SEO power, just as producing your own blog articles does.




Robots.txt


You can let a search spider know that it should only read specific portions of your site via the robots.txt file. By informing the search engines what pages are relevant for public use, you make it easier on them and save your own bandwidth.

One good use of the robots file is to section off parts of your site for exclusion from these scans so that you can work on the SEO within some areas while continuing to submit the stronger portions for search consumption.
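A minimal robots.txt along these lines might look as follows (the paths and sitemap URL are illustrative):

```
User-agent: *
# Keep transactional and work-in-progress areas out of crawls
Disallow: /checkout/
Disallow: /cart/
Disallow: /staging/
# Point crawlers at the pages you do want indexed
Sitemap: https://www.example.com/sitemap.xml
```

Remember that robots.txt is advisory for crawlers, not an access control; anything truly private needs real authentication.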


Anchor text with keywords


With internal links, you are best served with keywords in the anchor text. That approach allows better description to users prior to clicking. User experience is improved in this manner, and you will get higher click-through rates.
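The difference is easy to see in markup (the URL and wording here are illustrative):

```html
<!-- Descriptive, keyword-rich anchor text tells users and crawlers what to expect -->
<a href="/guides/standing-desks">standing desk buying guide</a>

<!-- A generic anchor wastes the same link -->
<a href="/guides/standing-desks">click here</a>
```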


301 redirects


You may have inbound links from other sites that lead to products you have removed (meaning the page is no longer live). You still want the positive search juice from those links, and you do not want people who click them to end up at dead ends. A 301 redirect will forward visitors headed to out-of-stock or removed items to similar pages that meet their needs.
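On an Apache server, for instance, such a redirect can be a single line of configuration (the paths are illustrative; nginx and most ecommerce platforms offer equivalents):

```
# .htaccess: permanently redirect a retired product page to its closest match
Redirect 301 /products/old-widget /products/new-widget
```

The "301" status tells search engines the move is permanent, so ranking signals from inbound links are passed along to the new page.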


High-performance infrastructure


To implement any 2018 ecommerce strategy, it is critical to have strong hardware and support to back you. At Total Server Solutions, we provide high-performance infrastructure and thoughtfully engineered services that are different, innovative, and responsive. See our approach.

Internet of Things


The percentage of the world’s population using the Internet has grown by huge amounts since 1995:

  • December 1995 – 4% (16 million)
  • December 2000 – 5.8% (361 million)
  • December 2005 – 15.7% (1.018 billion)
  • September 2010 – 28.8% (1.971 billion)
  • December 2015 – 46.4% (3.366 billion)
  • June 2017 – 51.7% (3.885 billion)

As that population has grown, the field of technology has simultaneously been fueled as a market, and the digital world has rapidly evolved. To have a sense of key trends is to understand how tech is changing so that you and your business can develop a stronger strategic stance and prepare for the years ahead.

Here are 8 of the biggest trends in technology for 2018:

Artificial intelligence (AI)

In order to improve customer experience, redesign business models, and bolster the way decisions are made, AI will be increasingly integrated into business.

A Gartner poll suggests that the need for AI is recognized throughout industry even as adoption is still ramping up: 59% of companies are still researching and developing an AI plan, while the other 41% are already piloting or have implemented AI.

AI will give businesses that use it a competitive advantage. Narrow AI, machine-learning geared toward performing a very specific function (as in driving a car in a test setting or comprehending language), will be the focus of most growth. General AI, meanwhile, is not seen as the most promising area at present since a broad application is not yet viewed as practical, per Gartner.

Internet of Things

The Internet of Things is a massive area of growth. The devices within the IoT (from smart watches to refrigerators to thermostats to cars) need to be able to collect large amounts of data and connect with other devices to exchange it without the need for any manual interaction.

Just about any device can be smart and connected, making it an IoT endpoint. A quarter of a billion vehicles will be connected to the Internet by 2020, making them all part of the IoT. Within our home, common digital objects such as televisions and personal assistants are connected. More items that are not typically smart, such as yoga mats that track the movement of the body, are joining the ranks.

The expansion of the IoT is mind-bending when you consider that IHS forecasts 75 billion connected devices by 2025.

Edge computing

As many companies are continuing to ramp up their cloud adoption, edge computing is also getting a rise in attention. Computing at the edge is gaining popularity in large part because of the degree of speed and performance needed for the IoT. AI-enabled devices, self-driving cars, drones, and various other devices will often communicate at the edge for true real-time processing. Edge computing will by no means surpass or supplant the needs for cloud, though, as indicated by Daniel Newman: “Though edge will continue to be the go-to choice for processing real-time data,” he notes, “it’s likely that the most important and relevant data will still head cloud-ward.”

Distributed trust systems

Distributed ledgers, cryptocurrencies, and blockchain all fall within the category of distributed trust systems: sets of tools that allow for the integrity of transactions, through reliable and tamper-evident methods, within a distributed design. Forces fueling this trend include press excitement around these approaches and the attention of venture capital – such as the funding of the startups Ripple, Digital Asset Holdings, Blockstream, and Circle. Although there is a lot of discussion of this method, Forrester researchers believe it will be a slow-developing market that emerges over the next 10 years.

Digital twins

Another concept that Gartner is taking very seriously is the digital twin, which is a digital representation of a system that exists in the real world. This concept is important to the Internet of Things because digital twins are connected to the real items, presenting data related to their physical “twins,” adapting when changes occur, streamlining processes, and improving efficiency. Billions of digital twins will exist to pair with the 21 billion IoT endpoints that will be deployed by 2020. The result of these digital twins is that businesses will be able to cut huge amounts of costs over time, with better performance of IoT equipment and maintenance repair and operation (MRO).

An immediate gain from these twins will be in the area of asset management. They will also lead to better understandings of the use of products and how to optimize operationally.

Beyond the Internet of Things, it will become possible to use digital twins as well. The idea is that eventually this practice will be ubiquitous, with digital counterparts related to each element of our environment, and each with AI capabilities. Industrial designers, healthcare executives, online marketers, and urban planners will see gains from this transition to a digital twin era. Entire cities could have digital twins for sophisticated simulations, while medical and biometric data could be used within human models.

Conversational platforms

Rather than the user having to translate intent into terms a machine understands, that translation will be performed by the computer as conversational platforms are implemented. A system of this type can answer questions about the weather or book a table at a specific restaurant. These platforms will become more intricate and more widely applied – for example, gathering testimony from crime witnesses and generating digital sketches from their descriptions. The one issue with these platforms is that the user has to interact with them within the confines of their structure – which can be irritating. The complexity of the way conversational models are built will be a major differentiator, says Gartner, as will the event models and APIs used to work with third-party services to create accurate and meaningful results.


Analytics

Analytics has grown as data has, and the IoT will create an enormous volume of information, driving further expansion of the market. The data coming in from the IoT will improve the way products are made, healthcare is delivered, and cities operate – in turn allowing organizations to become more productive and profitable. An example cited by Newman is a company with 180,000 trucks in its fleet that brought its per-mile management cost from 15 cents down to 3 cents – an almost absurd efficiency gain. The same approach can be applied to just about any use case a business might have. Analytics is a major point of focus for large IT companies, and IoT analytics in particular is becoming a specialty.

Immersive experience

Mixed reality, which spans augmented reality and virtual reality, is giving us a new way to perceive and interact with our digital environments. Along with conversational platforms, mixed reality systems will create an immersive experience for users. Development platform, application, and system software vendors will all compete to deliver this model, as indicated by Gartner.

During the next five years, mixed reality will become more prevalent. In this model, the user interacts with real-world and digital objects while staying present in the physical environment.

High-performance infrastructure

Are you wanting to take on some of these 2018 technology trends in your business with high-performance infrastructure? At Total Server Solutions, our services form the backbone of our infrastructure. See our true hosting platform.

2018 Cloud Computing Predictions & Trends Part 2

Posted by & filed under List Posts.

<<< Go to Part 1


Security and privacy will be even more important.


If you are in a compliance-driven industry such as healthcare, you may hear the words Security and Privacy so much that your eyes start to roll back in your head. There is good reason for that obsession when a company could be liable for a federal fine and a round of bad publicity – but all businesses should pay increasing attention to this overarching concern. Consider this: the Equifax breach alone impacted 143 million people. Incidents at that scale mean security and privacy are increasingly getting attention, beyond business, from the popular press and consumers.


Security and privacy are often the reasons firms have hesitated to implement cloud. Since that’s the case, let’s look to the opinions of computing thought leaders and analysts. David Linthicum has argued in a couple of places for cloud’s strengths; the titles say it all: “The public cloud is more secure than your data center” and “Clouds are more secure than traditional IT systems.”


Similarly, in coverage of 2017 Gartner research on cloud security, Kasey Panetta posits that chief information officers and heads of IT security should set aside any concerns they have about moving forward with cloud. Panetta writes that the research firm found security should not be thought of as a “primary inhibitor to the adoption of public cloud services” because the security provided through well-built cloud systems “is as good or better than most enterprise data centers.” One of the major pieces of evidence Gartner uses to back this claim is simply the number of attacks on cloud vs. those against legacy systems: compared to breaches of traditional data centers, public cloud implementations of infrastructure as a service, or IaaS (aka “cloud hosting”), are hit with 60% fewer attacks. Perhaps that is in part because attackers shy away from systems run with extreme attention to security tooling and monitoring (partially to overcome clients’ concerns about the technology). Whatever the reason, cloud should now be considered safe – safer, even, than traditional alternatives.


In order to deliver a strong security stance, you want to approach and build protections as a series of layers. A hacker might peel off one layer, but they are still not able to access your data. Any operation that is engineering a public cloud should have extraordinarily robust security layers in place, as can be verified through standard certifications such as SSAE 16 compliance.


With cloud, instead of being able to attack your website directly, an attacker would have to go through the third-party provider; the effort is instantly more complicated. Furthermore, you can create private clouds and integrate them with your public cloud as desired for additional protection.


Public cloud will power more enterprise apps.


Clint Boulton (cited in part 1) notes in CIO that enterprises have started to host their mission-critical systems in a public IaaS setting. Examples include Dollar Shave Club and Cardinal Health. SAP and other business apps are deployed by other enterprises in public cloud. The first choice for software hosting will continue to transition to cloud, according to Forrester researcher Dave Bartoletti. Bartoletti says that “the cloud is the best place to get quick insights out of enterprise data,” allowing companies to take their innovative thinking and convert it into technology and intelligence.


More cloud lift-and-shift will emerge.


Firms will often have systems running on legacy hardware that they want to move to a public cloud. These companies may not just want cloud but could benefit from help getting there. Ideally they are able to rewrite their code to embrace the dynamic nature of cloud platforms. With a focus on developing technology for easier lift-and-shift, people will be able to affordably perform bulk app moves, making the entire process of switching to cloud faster.


A cloud-based Internet of Everything will become more prominent.


During 2017, smartphones and tablets were used increasingly for communications and ecommerce – causing a surge in both the Internet of Things (IoT) and artificial intelligence, notes WebProNews. In 2018, the IoT will continue to have a dominant presence in computing. However, the broader Internet of Everything (IoE) will also become critical, as real-time analytics solutions and cloud solutions grow more sophisticated.


As the cloud computing ecosystem becomes smoother and more robust, the IoE will develop as well in its efficiency and streamlining capacities, because it is reliant on machine-to-machine interactions, the performance of systems processing data, and individual human beings engaging with their surroundings. As a result of this growing field, people will be able to communicate with other devices on a network seamlessly, with smarter information. Plus, these systems will allow deeper and more meaningful conversations between different parties.


Internet of Things


While the Internet of Everything will be a concern in and of itself, it would be ridiculous to push aside the importance of the IoT. In 2018, companies will use edge computing to better deliver Internet of Things projects. A basic gain of edge computing within a cloud setting is that you lower bandwidth and resource needs by sending analytics findings to central points, rather than transmitting all data in its raw form for processing. That matters in an IoT setting because the ongoing flow of data is so voluminous that it strains servers in a traditional data center; with cloud, a company can instead retrieve whatever data it wants, when it wants it. Edge computing has become more prevalent both because of its own strengths and because of the ways that AI and the IoT are intertwined. Beyond fueling the development of smart cities and smart homes, the IoT is also increasing the use of artificial intelligence platforms. At the edge, AI tools can reduce network traffic, speed responses, and improve customer retention. Gartner has noted that AI edge computing use cases are starting to appear.
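The bandwidth point can be made concrete with a small sketch (the sensor values here are simulated, and the summary fields are illustrative): instead of shipping every raw reading to a central data center, an edge node reduces a batch to a compact summary and transmits only that.

```python
import json
import statistics

def summarize_readings(readings):
    """Reduce a batch of raw sensor readings to a compact summary
    that an edge node would transmit to the central cloud."""
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "min": min(readings),
        "max": max(readings),
    }

# Simulate ten minutes of raw temperature telemetry from one IoT sensor.
raw = [20.0 + (i % 7) * 0.1 for i in range(600)]

raw_payload = json.dumps(raw)                          # what "send everything" costs
summary_payload = json.dumps(summarize_readings(raw))  # what the edge actually sends

# The summary is a small fraction of the raw payload's size.
print(len(summary_payload), "bytes vs", len(raw_payload), "bytes")
```

Real deployments aggregate far richer statistics (and run inference at the edge), but the principle is the same: only the analytics findings travel upstream.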


Tools supporting the management of inventory control, workflow, and supply chain networks all now have IoT use cases – which will continue to proliferate in 2018, notes AnalyticsWeek. As companies become increasingly dependent on the Internet of Things, they will in turn have to update their business software for the modern era.




Containers will become more central.


Companies are aggressively moving to containers as a simpler model for code management and migration. Organizations are using containers, among other things, to make it easier to move their apps from one cloud to another, says Bartoletti. Generally, they want to be able to achieve faster time-to-market with a cleaner devops approach.


Cloud hosting systems now recognize that the ability to integrate the use of containers is key.


This emergent method for software portability is genuinely useful. However, it will take time for the industry to adjust intelligently to the new landscape: networking, storage, monitoring, and security problems will become more apparent as containers are more widely implemented. Often companies will choose a hybrid solution blending private and public components.


Choosing the right cloud


The above look at 2018 forecasts and trends shows how cloud is changing on the whole, as a market and in terms of the trends that are building. While that overview is helpful, it is also important to consider how you will directly meet your business’s cloud needs.


At Total Server Solutions, you will get the fastest, most robust cloud platform in the industry. We do it right.

2018 Cloud Computing Predictions

Posted by & filed under List Posts.

On November 7th, Forrester Research released its list of top cloud computing predictions for 2018. This release is an event in the sense that it gives the public, individual businesses, and technology thought leadership a better understanding of how this important market and area of knowledge is developing.


After all, cloud computing is growing incredibly fast as an industry and economic segment. $219 billion was spent worldwide on cloud services in 2016, per Gartner research director Sid Nag in an analysis released in October 2017.


Also in 2016, software as a service (SaaS) spending reached $48.2 billion, beating Gartner’s previous market projection. As they say, but wait… there’s more – more growth forecast for cloud moving forward. Gartner estimates that infrastructure as a service (IaaS), another term for cloud hosting, is currently clocking a head-turning 23.31% compound annual growth rate (CAGR). To get a sense of that growth rate, it helps to look at the historical United States gross domestic product (GDP): from 1948 to 2017, GDP grew at an average annual rate of 3.19 percent, according to economic indicator resource Trading Economics.


Cloud computing: market & general predictions


Here are predictions for growth of cloud from that same Gartner report, to keep looking (for a moment) just at the level of market expansion:


  • Total spending on cloud services in 2017 will show global expansion of 18.5% – reaching $260.2 billion.
  • Also during 2017, the amount designated for SaaS will rise 21%, hitting $58.6 billion by the New Year.
  • Cloud hosting, or IaaS, will show an almost ridiculous expansion of 36.6% in 2017. Reaching a total market of $34.7 billion by the end of the year, this segment is growing faster than any other type of cloud solution.


The basic point of the state of cloud and its projected rise is that this field is skyrocketing. However, a stronger takeaway will be the nature of this growth – understanding exactly what is involved, trends, and other forces that are influencing the area. We get a better sense of that broader perspective through highlights from the 2018 edition of Forrester’s annual cloud computing forecast:


#1 – Savvy businesses will become more conscientious about vendor lock-in as many cloud firms will continue to consolidate. (Note that consolidation is particularly prevalent with firms that do not have general hosting packages and other niche specializations).


#2 – Many software-as-a-service (SaaS) firms will offer their services through platforms as well as standalone.


#3 – Cloud platforms will increasingly specialize, centering their attention on particular regions – especially those outside North America – or on the specifications and concerns of particular sectors.


#4 – The open source container orchestration platform Kubernetes will become pivotal for strong cloud environments.


#5 – Many firms will build clouds on-site, accelerating the use of hybrid and private forms of the technology.


#6 – Development of software and platforms will become a greater focus with private clouds as options start to extend more aggressively beyond hosting.


#7 – With the market for cloud management systems heating up, users will be able to get these environments either piecemeal or as complementary features of larger offerings.


#8 – Fully 10% of traffic will be rerouted to cloud service providers and to colocation, away from carrier backbones.


#9 – Virtual reality or immersive software will become fundamental to increasing speed and generating trackable, sustainable improvements in development of applications.


#10 – The precept of Zero Trust security will near ubiquity within cloud environments as it is accepted as a core best practice. (Zero Trust is a relatively new way, or newly packaged way, in which to understand authentication and security within a site. This rule is sometimes compared to what is supposedly the traditional model for information technology: Trust But Verify. In the IDG publication CSO, security and compliance thought leader Robert C. Covington notes that the established model has been Trust OR Verify. With Zero Trust, the idea is to segment the network into components such as database, web, wireless, and LAN, and then to treat each of them as untrusted despite the fact that they are internal systems.)


Probably the most compelling forecast of all from Forrester is this summary comment: “In 2018, cloud computing will accelerate enterprise transformation everywhere [bold theirs] as it becomes a must-have business technology.”


Top 2018 cloud trends


The above assessment of the market and predictions in terms of how things are changing touched on a few trends, but now let’s look at top trends directly:


Colocation will become more broadly adopted.


IT chiefs want to phase out their internal data centers, notes Clint Boulton of CIO. Some of that business goes to cloud, but a significant amount is being entrusted to colocation.


The use of a colocation center, or colo (often run by a hosting service), makes it possible for a CIO to have the company’s hardware and other infrastructure supported and maintained within a managed environment (which may have the added benefit of easy integration with cloud apps and hosting).


Dave Bartoletti, an analyst at Forrester, explains that deploying a multi-cloud plan is simpler in this context, and that testing different clouds can be as easy as possible without the need to start thinking about migration upfront.


Artificial intelligence and machine learning take center stage.


Cloud environments are developing rapidly based on insights from artificial intelligence (AI) and machine learning – and these types of systems are available to users as cloud-based services as well. Tools of this type will make it easier for businesses to analyze their data and make faster, smarter business decisions.


Hyperconvergence will be embraced for private cloud.


Some companies, particularly in industries for which compliance is key, are still unsure about moving confidential data and mission-critical services to an outside entity. That remains the case even while many organizations now cite security concerns as a reason to move to cloud. (Thought leaders have suggested that public cloud is generally more secure than on-premise data centers. That belief is reflected in a 2016 survey of 210 information technology leaders. 50.8% of respondents said that the primary motivator behind the migration to public cloud was that the security of the cloud was stronger than in their internal data center.)


Again, despite that shift toward greater trust in cloud security, the risks of having an outside party control the systems that store and access your data can make people feel hesitant – so they turn toward private cloud instead. Putting together a private cloud may be more common now, but it is not as simple as it may sound, says Boulton. It can be challenging and costly to integrate all the necessary components of a strong environment to achieve the ends that we now need in cloud (virtualization, automation, resource tracking, self-service accessibility, standardization, etc.).


To make cloud easier to launch, storage, processing, and networking resources are now prepackaged as hyperconverged infrastructure (HCI) plans. When you want to implement private cloud, Forrester suggests HCI, especially in cases where speed and immediate scalability are paramount. These prepackaged systems are making it possible for firms to provision private clouds quickly.


For further conversation about emergent and growing cloud trends, see the second half of this report, “Part 2: More 2018 Cloud Trends”.


The right cloud for your business


Are you considering your business’s approach to cloud for 2018? At Total Server Solutions, we believe a cloud-based solution should be scalable, reliable, fast, and easy to use. We do it right.

8 Mistakes People Make with WordPress

Posted by & filed under List Posts.

WordPress is not an umbrella technology used by the entire web – but it is pretty close. It underpins 29.0% of all sites assessed in the continually updated Web Technology Surveys market-share data.


As a tool, WordPress is a content management system (CMS) that simplifies website management. Although a CMS is fundamentally centered on content, functionality of the site is expanded through plugins, and design of the site is adjusted through the choice of site theme.


This platform is an extremely dominant brand within the CMS market, holding an incredible 59.8% of the market share. CBS Local, CNN, NBC, the New York Post, TechCrunch, TIME, TED, and many other sites use WordPress to deliver their message and updates to their audiences.


The fact that many people use WordPress also means that many mistakes are made by organizations as they use the technology to build their sites. Here are 8 of the most common errors companies have made, presented here so that you can avoid them yourself:


#1 – Plugin overload


WordPress is often discussed in terms of its extraordinary flexibility – certainly at the level of its open source code but also at the simple level of quickly enhancing your functionality with plugins. As of this writing, there are 53,033 plugins. Since there are so many of these optional add-on programs, it can be easy to get excited and install many of them that are unnecessary. Here are three basic issues with excessive plugins:


  • Each of them is a security risk that may not be updated as often as you’d like;
  • Generally, more plugins will mean that your site is less lean and fast; and
  • When you update to a new release of WordPress, plugins can cause your site to break (which is why you need to back up before updating) – so the fewer of them, the better.


#2 – Retention of unused plugins


Get rid of plugins that you are not using, and verify that the plugin files are removed from your server. Plugins that a site is not actively using are an unguarded gate: if you are not using them, you probably are not updating them, so security holes arise.


#3 – Not backing up the site


We all have the to-do list items that we “backburner.” Do not let site backup be one of those backburner items.


WordPress developer Nathan Ello frames backup as insurance for your web presence; that is essentially what it is. It is unpleasant, and may even feel a bit paranoid, to consider worst-case scenarios – but doing so is due diligence that is essential to protection. If you have no backup and let your hosting lapse, your site’s files are at risk of disappearing (although more customer-centric hosts handle these situations better).


Beyond what is provided for backup through your arrangement with your host, you can also use a plugin such as BackupBuddy. BackupBuddy is the DIY option, effectively; all support and management could also be handled externally through your host. High-quality backup solutions are readily available to meet this need if you want to leverage the expertise of a specialized third party.


#4 – Thinking a child theme is unprofessional


When you first hear the idea of a child theme, it may sound like a design so amateurish and incomprehensible that it must be separately explained to each person who views it: “No – that’s not a horse. It’s a submarine!”


To understand child themes in context, WordPress sites use themes for the design. Themes are templates for the site – basically pieces of software that are added to the core WordPress code to make your site look and function in a particular way (still with full access to the open source code). The advantage of themes generally is that they allow you to make your site aesthetically pleasing without having to do anything at the level of the code to install and start using them.


While having access to themes is great, you will almost inevitably reach a point at which you want to customize to really make the site your own. Typically a person will hire a third party to make adjustments to their theme.


Once you have modified a theme, you may feel all is well; but in the absence of a child theme, disaster is lurking. In that situation, when a new version of the theme comes out and appears as a Theme Update button within the admin portal, updating without a backup will “pave over” any tweaks your paid developer made. That means any code a developer changed to better suit you specifically may be gone forever – or at least missing until the coder can replace it, during which time your site will look prehistoric compared to its state before the update.
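Setting one up is modest work. A child theme is, at minimum, a folder containing a style.css whose header points at the parent; your customizations then live in the child, so parent-theme updates no longer pave them over. A minimal sketch, with hypothetical theme names:

```css
/*
 Theme Name:  My Site Child
 Template:    parent-theme-folder
 Description: Holds our custom tweaks so parent updates cannot overwrite them
*/

/* The Template value must exactly match the parent theme's directory name. */
/* Custom CSS (and any overridden template files) go in this child folder.  */
```

Once the child theme is activated in the admin panel, the parent can be updated freely while the customizations survive.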


#5 – Failure to update to the latest WP version


You must be concerned about backing up before you update, yes. However, you MUST update. Updates are better for the speed of your site. They will make it function better, fully supporting all the latest versions of plugins and themes. Most importantly, though, the newest version of the WordPress core code will have all the latest security patches. Set up auto-updates or get management assistance if needed; both of these options are far better than neglecting to update, especially since old versions are such a common vulnerability exploited by hackers.
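If you go the auto-update route, WordPress exposes a documented constant for core updates; one line in wp-config.php opts you in (minor security releases are applied automatically by default):

```php
/* In wp-config.php: enable automatic updates for all core releases, */
/* not just the minor security/maintenance ones applied by default.  */
define( 'WP_AUTO_UPDATE_CORE', true );
```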


#6 – Skipping important aspects of customization


Customization is often incomplete or sloppy. Here are elements that often do not get enough attention, according to Laura Buckler in Torque:


  • Favicon – On your web browser, you will see a very small icon right next to the title of the page. The favicon is a powerful way to improve your branding. Try your logo or a modified version of it.
  • Permalinks – Every WordPress site uses permalinks to systematize the URLs of pages and posts. Changing from the default structure will help your search engine presence; this tactic will also help your reach on social platforms.
  • Administration – When you install WordPress, you may want to get it up and running immediately. However, the default “admin” username is insecure. Beyond the risk of data compromise, you also do not want to be responding to comments under a nondescript, unbranded name.
  • Tagline – Your elevator pitch or slogan is your tagline. Out of the box, the tagline for every WP installation is “Just another WordPress site.” The description is not exactly enticing and says more about lack of customization than anything else.
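To make the permalink point concrete: under Settings → Permalinks you can swap the default structure for a descriptive one. The custom structure below is a common choice, not the only valid one (example.com is a placeholder):

```
Default (plain):   https://example.com/?p=123
Post name:         https://example.com/my-post-title/
Custom structure:  /%category%/%postname%/
```

%category% and %postname% are standard WordPress structure tags that expand to the post's category slug and URL-friendly title.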


#7 – Category overload


Just like plugin overload is an issue, you can also end up with far too many categories. You want the categories to simply organize your primary topics.


Hierarchize this aspect: you should have categories and subcategories. Allow the categories to define the scope of your content at the point of origination; a piece’s topic must fit within one of the categories or subcategories to be viable.


#8 – Overlooking infrastructure


Infrastructure, the “back end” of your site, is often overlooked. Consider this: speed is not only fundamental to engagement but has been a search ranking factor for almost a decade. The performance delivered by the hardware that actually responds to requests from users will be key in determining how strong the user experience is.


Beyond the equipment itself, you also may need help along the way. According to people who have used our high-performance services at Total Server Solutions, we are knowledgeable and quick to respond to support issues. See our testimonials.

How to Help Attorneys Embrace the Cloud

Posted by & filed under List Posts.

Helping attorneys use cloud-based solutions is about explaining why the technology is so valuable – that it has security, speed, access, and collaboration benefits for firms.


While just about every industry will end up using cloud computing environments, its growth has obviously been faster in some areas than others. For example, the cloud grew quickly with startups and SMBs, but it took longer for the technology to become popular in the enterprise. Cloud has increasingly become standard practice within healthcare. In fact, the Health and Human Services Department created an extremely thorough, wide-ranging, and fully cited document specifically dedicated to the topic (“Guidance on HIPAA & Cloud Computing”).


Another industry that has been more skeptical about moving to cloud is law. For obvious reasons, law firms have extreme concerns about protecting clients’ highly sensitive data to the greatest possible degree – which is why they look for signals such as a data center whose infrastructure is verified and certified to meet the parameters of SSAE 16 compliance (short for the Statement on Standards for Attestation Engagements No. 16, a set of principles developed by the American Institute of Certified Public Accountants).


Since it seems that the transition to cloud is increasingly occurring for law firms, it makes sense to figure out how best to help them smoothly make the migration with confidence. Here are a few things it is good to let law firms know when they are considering transitioning to the cloud:


#1 – Adoption rates suggest lawyers want the convenience.


Today, more lawyers are using the cloud than ever before. The technology is appreciated within law for the same reason it is in other fields: it is incredibly convenient, allowing you to access your systems from anywhere you can get a web connection.


Part of the reason firms are adopting cloud is that bar associations, through ethics opinions, are helping to further the understanding of how cloud can be used responsibly by attorneys. Attorneys are taking advantage of cloud platforms for website hosting and email; for sharing files to allow collaboration with internal and external partners; for backing up HR details; for securing their networks against intrusion; for accessing and manipulating files from a remote place; and for taking work off the slate of the IT team (so it can focus on innovation rather than infrastructure and maintenance).


The numbers back up the idea that this distributed virtual network model is becoming central to law: an American Bar Association (ABA) study from 2016 reveals that at least one cloud service has now been adopted by 37.5% of attorneys. That same figure was at 31% in 2015 and 20% in 2014 – so a transition to this form of computing clearly continues. Other figures suggest that adoption by law firms is even more widespread: among the firms in the Am Law 200 that answered The American Lawyer’s 2015 Am Law-LTN Tech Survey, 51% of those 79 respondents said they had adopted cloud computing in some form.


#2 – Part of the reason the cloud has become so much more prevalent is that it is becoming recognized more widely as a secure choice.


There is growing belief within the legal community that security and privacy are properly delivered within cloud environments. Law firms that consider cloud security solid now have support from experts: thought leader David Linthicum calls people who remain unsure about cloud technology the “folded arms gang,” and he convincingly argues that security is better within cloud environments than within traditional on-premise data centers.


#3 – Cloud can be used to enhance your mobility.


The cloud lets you deliver data seamlessly to smartphones and tablets when you are away from your computer but want to stay productive throughout the day. Having your information in the cloud means it lives on a distributed virtual infrastructure (at a remote data center managed by a third party, if it’s a public or managed private cloud) rather than sitting only on servers behind your office firewall. You can get information on-demand, just as your clients can. You can share or retrieve files between attorneys in a straightforward, simple, and efficient fashion. And you can move data from one party to another without putting it at risk – whether you are sharing materials with clients, with litigation partners within the firm, or with other attorneys outside it – right now, rather than waiting until you get back to the office.


#4 – You don’t sink money into hardware that loses its value as you go.


If you spend the capital on your own data center for a traditional solution (whether dedicated or virtualized), you are investing in machines that will depreciate over time, gradually becoming obsolete. With the cloud, you do not need to buy the physical equipment; the cloud provider manages it, updating and maintaining it seamlessly over time. As noted in Law Technology Today, you will not face as big an upfront price tag to start a cloud system since that hardware is not needed. Basically, everything is handled behind the scenes, and you are not even aware when updates are taking place.


#5 – Cloud lets onsite IT take a breath.


A cloud provider offers 24/7 support, which can be extremely helpful to a firm that does not have a large IT department (which is true of most). Support from the CSP includes real-time oversight and checking of systems for active threats. The provider will also manage the system to maximize the scalability of a plan so that resource distribution is meaningful and fits the needs of users. Service level agreements give attorneys a sense of what the provider guarantees in the areas of support and service.


Again, as indicated above, you can set up mission-critical cloud apps so that you can use your system anywhere you want. With faster access to your digital environment, you can move quickly and achieve a healthier work-life balance. As cloud systems trend toward greater mobile management, people will have an even simpler time working with their data and systems from any location. In turn, attorneys will be better able to work together to yield better results for all involved.


#6 – You get a platform that is better designed to leverage data analytics.


For your data to have value, you must analyze it. Law firms are increasingly adopting cloud so that they can better run analytics – with cloud tools that improve how they can use what they have at their fingertips for business intelligence, possibly improving their success rate at getting clients to work with them. Cloud systems may also reveal inefficiencies.


Launch legal cloud within an SSAE 16 compliant setting


Are you interested in setting up a cloud solution that meets the needs of your law firm? The SSAE 16 Type II Audit is your assurance that Total Server Solutions follows the best practices for data availability and security. See our SSAE 16 audit statement.

5 Top IoT Challenges

Posted by & filed under List Posts.

Underwriters Laboratories (UL), the certification and compliance company founded in 1894, has given physical form to the testing needs of the internet of things (IoT) era. The IoT – well, the consumer IoT at least – is about embedding connected computing devices within everyday household objects. Since that’s the case, it makes sense that a strong testing ground for it would be a house.


Enter the UL “Living Lab.” The lab is a two-story residence that provides a real-world setting in which devices can interact, so that researchers can check whether these environments operate quickly and coherently, without security compromises or interoperability snags. At the Living Lab, people within the house use various IoT devices to verify that they function in accordance with one another and the external world.


A few of the factors that are of greatest concern to the UL researchers within this environment are typical ones that impact network and device performance:


  • Floor plan – how ceilings and walls might interfere with connection
  • Noise – the influence of ambient noise from residents or other “things”
  • Acoustic elements – the impact of furniture, drapes, rugs, and carpets
  • Other Wi-Fi – additional radiating devices, including nearby Wi-Fi networks, that interrupt your own system’s communications
  • IoT overload – bandwidth consumed by many different devices


Essentially, the Living Lab project allows UL to uncover issues in a sort of “fishbowl” setting. From a more general perspective, the challenges of IoT technology can be understood through a framework provided by Ahmed Banafa of the University of California, Berkeley. His lenses through which IoT technology can be understood are security; connectivity; sustainability and compatibility; standards; and derivation of insights for intelligent action.


Security


Security is a central concern of the internet of things. With all the new nodes come new ways for hackers to find their ways into the network – especially since devices are often not built with strong security in mind (because the IoT is growing so rapidly now, with a focus placed more substantially on function than on data protection).


How critical is security to the IoT? Look no further than this November 8, 2017, headline by Charlie Osborne of ZDNet: “IoT devices are an enterprise security time bomb.” The evidence comes from Forrester Consulting, whose poll of 603 line-of-business and IT executives at large companies in six nations (including the US and UK) found that 82% of respondents said they would not necessarily be able to pass an audit because they could not locate all the operational technology (OT) or internet of things devices on their networks.


Partially due to this lack of knowledge about the technology, the stresses of the internet of things are real as well, according to the survey: 54% of respondents reported that the IoT is a cause of stress, saying they feel unsure it has the protection it needs.


Curiously enough, the ZDNet piece reveals one of the problems holding back security: uncertainty about the IoT itself. Companies typically were not investing large amounts in internet of things projects, in part because executives were still rather reserved on the topic. With tight budgets, 2 out of 5 staff members polled said that their organizations were using traditional tools to protect IoT systems.


“This is a glaring issue for today’s firms, which need crystal-clear visibility into networks where BYOD and IoT are common,” said Osborne.


Connectivity


Connectivity is another basic concern of the IoT that will push us beyond the server/client communication paradigm that we have used previously for node authorization and connection.


Server/client is a model that is well-suited to smaller numbers of devices. With the advent of the IoT, though, networks could require the integration of billions of devices, leading to bottlenecks in server/client scenarios. The new systems will be sophisticated cloud settings capable of sending and receiving massive amounts of information, scaling as needed.


“The future of IoT will very much depend on decentralizing IoT networks,” noted Banafa.


One way that decentralization is achieved is by transitioning certain tasks to the edge of the network, as with fog computing architectures that use hubs for mission-critical processing, with data collection and analytics through the cloud.
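To make the fog idea concrete, here is a minimal sketch (in Python, with purely illustrative names and numbers – no real IoT platform API is assumed) of a hub that processes readings locally and forwards only a compact summary to the cloud:

```python
# Hypothetical edge-hub logic: raw sensor readings are reduced locally,
# so the cloud receives one small summary instead of every data point.

def summarize(readings, alert_threshold):
    """Reduce raw sensor readings to a compact summary at the edge."""
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        # Only out-of-range values are forwarded individually.
        "alerts": [r for r in readings if r > alert_threshold],
    }

# Instead of streaming five readings to the cloud, the hub sends one payload.
readings = [21.0, 21.5, 22.0, 35.5, 21.2]   # e.g., temperature samples
payload = summarize(readings, alert_threshold=30.0)
```

The design choice is the point: the bandwidth-heavy work (collection, filtering) stays at the hub, while the cloud handles long-term storage and analytics.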


Sustainability / Compatibility


Currently, many different companies, each using its own protocols, are trying to develop the standards for the internet of things. This free-market competition can boost innovation and options; however, additional software and hardware may be necessary to interconnect devices.


Disparities between operating systems, firmware, and machine-to-machine (M2M) protocols could all cause challenges in the IoT.


The reason these two elements, sustainability and compatibility, are discussed under the same heading is that compatibility is directly linked to the ability of the broader ecosystem to survive long-term. Some technologies will inevitably become obsolete in the coming years, which could render their devices worthless. No one wants their refrigerator to become unusable a year or two after purchase because the manufacturer is no longer in business.


Standards


Standards for data aggregation, networking, and communication will all help to determine processes for the management, transmission, and storage of sensor data. Aggregation is critical because it improves the availability of data (in frequency of access, scale, and scope) for analysis. One concern that will make it harder to arrive at agreed standards in this field is the issue of unstructured data. Information within relational databases, called structured data, can be accessed via SQL. However, the unstructured contents of NoSQL databases are not accessed through one standard technique. Another issue is that companies may not have the on-staff skillsets they will need to leverage and maintain cutting-edge big data systems.
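As a simple illustration of that contrast (using a throwaway in-memory database and invented sensor records, not any real deployment), the same question is a one-line standard SQL query against structured rows but bespoke code against free-form documents:

```python
import json
import sqlite3

# Structured data: any SQL-speaking tool can run this exact query.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sensors (id TEXT, temp REAL)")
db.executemany("INSERT INTO sensors VALUES (?, ?)",
               [("kitchen", 22.5), ("garage", 9.0)])
hot = db.execute("SELECT id FROM sensors WHERE temp > 20").fetchall()

# Unstructured documents: shapes vary, so the access logic is bespoke.
docs = [json.loads(s) for s in (
    '{"id": "kitchen", "readings": {"temp": 22.5}}',
    '{"id": "doorbell", "event": "motion"}',
)]
temps = [d["readings"]["temp"] for d in docs if "readings" in d]
```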


One of the key reasons that standardization will be so helpful to the IoT is simply that it will make everything easier – as noted by Daniel Newman in Forbes. Currently, you cannot simply plug in a device. Instead, apps and drivers have to be installed. The technology should be simpler. Through APIs and open source technologies, IoT manufacturers will be able to integrate their devices with the worldwide ecosystem that already exists. “If these items use the same ‘language,'” said Newman, “they will be able to talk in ways they—and we—understand.”
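A tiny sketch shows what that shared “language” buys (the device classes here are invented for illustration): once heterogeneous devices expose the same interface, one piece of code can handle all of them.

```python
# Hypothetical devices from different manufacturers agreeing on one
# status() call – the kind of uniformity standardization would provide.

class Thermostat:
    def status(self):
        return {"device": "thermostat", "on": True}

class DoorLock:
    def status(self):
        return {"device": "door_lock", "on": False}

# Because both expose the same call, a single loop manages any device.
devices = [Thermostat(), DoorLock()]
report = [d.status() for d in devices]
```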


Derivation of Insights for Intelligent Action


Finally, the IoT must have takeaways. Cognitive technologies are used in this setting to improve analysis and spark more powerful findings. Key trends related to this field include:


  • Lower cost of data storage: The volume of data that you have available will make it easier to get the results you want from an artificial intelligence (AI) system, especially since storage costs are lower than in the past.
  • More open source and crowdsourced analytics options: Algorithms are developing rapidly as cloud-based crowdsourcing has become prevalent.
  • Real-time analytics: You are able to get access to data that impacts your business “right now,” with real-time analysis through complex event processing (CEP) and other capabilities.
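As a toy example of the CEP idea (the rule, threshold, and event name are invented for illustration), a sliding window can turn a stream of raw readings into a single composite event:

```python
from collections import deque

def detect_overheating(stream, threshold=30.0, window=3):
    """Fire a composite event when `window` consecutive readings
    all exceed `threshold` – a minimal CEP-style pattern rule."""
    recent = deque(maxlen=window)          # sliding window of readings
    events = []
    for t, value in stream:
        recent.append(value)
        if len(recent) == window and all(v > threshold for v in recent):
            events.append(("overheating", t))  # composite event fires
    return events

# Timestamped readings; only one run of three hot values occurs.
stream = [(1, 25.0), (2, 31.0), (3, 32.0), (4, 33.0), (5, 29.0)]
alerts = detect_overheating(stream)
```

Production CEP engines apply far richer pattern languages, but the principle is the same: react to patterns across events, not to single data points.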


High-Performance Infrastructure for IoT


The above challenges are certainly not holding back the forward momentum of the internet of things. As it expands, strong and reliable cloud hosting will be fundamental to the success of individual projects.


Are you in need of a powerful cloud to back your IoT system? At Total Server Solutions, we engineered our cloud solution with speed in mind, and SSD storage lets us provide the high levels of performance that you demand. Get the only cloud with guaranteed IOPS.