NAPEO Presentation: The PEO Industry Footprint in 2018

After many hours and much hard work, the folks at NAPEO, with their research partners at McBassi & Company, have completed a detailed analysis of the PEO industry footprint as of 2018.  Because there is no government entity that tracks data specific to PEOs, NAPEO gathers and maintains this data every few years for the benefit of all.  I recently had the pleasure of attending a webinar during which they reviewed and discussed the findings published in their latest white paper, titled An Economic Analysis: The PEO Industry Footprint in 2018.  Many of the results of this research effort are very exciting for the PEO industry and are summarized below.

Key Points:

  • The total number of PEO WSEs (worksite employees) is greater than the combined employment of many of the largest and most successful companies in the US
    • Despite this impressive statistic, only 12.1% of small businesses (firms with 10-99 employees) currently use a PEO
      • This leaves room for significant continued growth in the PEO sector in the coming months and years

  • The growth rate of PEO WSEs is significantly higher than the growth rate of employment in the US economy as a whole

Additional details:

  • It is estimated that there were 907 PEOs operating in the US at the end of 2017
  • This includes 175,000 PEO clients, and
  • Approximately 3.7 million WSEs (worksite employees)
    • This employee count is equal to the combined employee count of some of the largest and most notable companies in America

  • Total estimated payroll paid in 2017 for these employees was approximately $176 billion
  • Growth rate of the PEO industry was approximately 14 times higher than the growth rate in employment in the US economy overall in 2017

  • That being said, only 12.1% of small businesses in the PEO “Sweet Spot” are current PEO clients
    • PEO “Sweet Spot” is defined as companies with 10-99 employees
  • The 2017 PEO industry growth rate corresponds to a sustained rate at which the industry doubles in size every 9-10 years

  • Below is a list of PEO counts by state at the end of 2017
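The doubling-time figure above is straightforward compound-growth arithmetic: at a constant annual growth rate r, a quantity doubles in ln(2)/ln(1+r) years. A quick sketch (the ~7-7.5% annual rates used below are assumptions consistent with the 9-10 year doubling claim, not figures taken from the white paper):

```python
import math

def doubling_time(annual_growth_rate):
    """Years for a quantity to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth_rate)

# An assumed sustained growth rate of roughly 7-7.5% per year
# implies the industry doubles about every 9-10 years.
print(round(doubling_time(0.075), 1))  # 9.6
print(round(doubling_time(0.07), 1))   # 10.2
```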

You can review a copy of the complete PowerPoint from the webinar as well as a copy of the published white paper by clicking on the following links:

The PEO Industry Footprint in 2018 – PowerPoint

2018-white-paper-final

To view a recording of the webinar, follow the instructions below:

  1. Follow this link: https://napeo.webex.com/napeo/onstage/playback.php?RCID=70d70d63e45404ce0bd939b1a1b2b533
  2. Select Playback

Many good questions were asked during the presentation, so the video is well worth the watch (runs about 45 minutes).

We extend a sincere thank you to NAPEO for all the great work they do to advocate for and substantiate our industry on behalf of all PEOs.

Are YOUR Client Companies Profitable?

The business model of many PEOs includes reselling workers’ compensation coverage as a source of profit margin. For this to be successful, the PEO must understand the liabilities and assets associated with each of its workers’ compensation policies and price them appropriately. Both guaranteed-cost and loss-sensitive platforms have many variables that need to be understood over the course of the policy term to do this successfully. Because of this, understanding profitability at a portfolio or even a policy level can sometimes be a challenge. Understanding the profitability of individual client companies within master policies, or with exposures spread over multiple policies, adds a further level of complexity.

At RiskMD we are able to seamlessly solve this problem! By tracking the assets (premium) and liabilities (claims) of each client company based on its unique FEIN, we are able to understand loss ratios and loss/profit margins for each client company within a given book of business. This holds true regardless of how coverage for the client company is structured, i.e., master policies, MCPs (multiple coordinated policies), client direct policies, or a combination thereof. It also holds true year-over-year regardless of changes in carriers or policy structure for any given client company.
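The FEIN-keyed roll-up described above can be sketched in a few lines. This is a minimal illustration with invented records and field layout, not RiskMD's actual patented process:

```python
from collections import defaultdict

def loss_ratios_by_fein(premiums, claims):
    """Aggregate premium (assets) and claims (liabilities) by FEIN and
    return the loss ratio for each client company, regardless of which
    policy, carrier, or structure each record came from."""
    totals = defaultdict(lambda: {"premium": 0.0, "claims": 0.0})
    for fein, amount in premiums:           # (FEIN, premium dollars)
        totals[fein]["premium"] += amount
    for fein, amount in claims:             # (FEIN, incurred claim dollars)
        totals[fein]["claims"] += amount
    return {fein: t["claims"] / t["premium"]
            for fein, t in totals.items() if t["premium"] > 0}

# Toy example: one client split across a master policy and an MCP.
premiums = [("12-3456789", 80_000), ("12-3456789", 20_000), ("98-7654321", 50_000)]
claims   = [("12-3456789", 55_000), ("98-7654321", 10_000)]
print(loss_ratios_by_fein(premiums, claims))
# {'12-3456789': 0.55, '98-7654321': 0.2}
```

Because the key is the FEIN rather than a policy number, the same roll-up works unchanged when a client moves between carriers or policy structures from year to year.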

This analysis of each client company can be performed using the carrier’s billed premium or the PEO’s charged premium. This allows us to understand performance of clients and policies as the carrier would view them, giving us greater leverage for negotiating pricing at renewals. Additionally, this allows us to understand client and policy profitability to the PEO itself.

To learn more about RiskMD’s patented process and how to understand YOUR data, contact David Sink at (407)613-5489 or by email: dsink@riskmd.com

The E-merging Risk that Keeps on E-volving: Cyber

For PEOs, and for those of us who provide services and insurance to them, small and medium-sized businesses are the “bread and butter” of the client base we target.

“According to an ISO analysis, 80 percent of cyber breach victims in 2017 were small and medium-sized businesses.” — Neil Spector, president, ISO, a Verisk business

A great article on the current state of cyber from our friends at insurancejournal.com:

The E-merging Risk that Keeps on E-volving: Cyber

    6 Reasons Cyber Remains Top Emerging Risk

    Property/casualty insurance experts may not agree on everything but there is a consensus that the most important emerging risk for the industry remains the five-letter word: CYBER. It is not new, of course, but it stays atop emerging risk lists because of its dynamic and pervasive nature.

    Insurance Journal defines emerging risks as those that are new and not yet widely recognized, or perhaps recognized but not well understood. A number of industry leaders explain why cyber remains such an important risk to watch.

    Not Slowing Down

    The number of data breaches and the average costs of cyber-crime are rising every year. These trends show no signs of slowing down. In fact, cyber risk is becoming more concerning as crime-as-a-service gains popularity and artificial intelligence technologies are used more frequently in attacks. Internet of Things devices are increasing the attack surface and providing more ammo for hackers. One of the more difficult aspects about insuring cyber risk is the dynamic nature of the risk. Just a few years ago, cyber-attacks primarily involved stealing private credit card and health information from large companies. Today, cyber criminals focus on completely different tactics for making money, such as locking out users from computer systems using ransomware, or secretly hijacking computers to mine cryptocurrency. And large corporations aren’t the only targets. According to an ISO analysis, 80 percent of cyber breach victims in 2017 were small and medium-sized businesses. — Neil Spector, president, ISO, a Verisk business

    Keeping Up with IoT

    The biggest risks involve cyber crime. Under “emerging risks,” one of the biggest is the Internet of Things (IoT), and the cybersecurity risks created by billions of interconnected devices. The challenges for agents and brokers multiply in regard to understanding the potential implications, such as IoT devices in homes and businesses — tracking sensors, fire/flooding/intrusion warning devices and more. Agents need to be aware of the questions to ask clients to ensure they are offering complete coverages. They need to be vigilant in keeping up with the IoT devices emerging at an astonishing pace. — Robert Rusbuldt, CEO, Big “I” Independent Insurance Agents & Brokers of America.

    High Severity

    There are many scenarios where cyber risk comes into play, but one example is related to vehicle systems. Luxury automobiles, for example, have up to 150 or more computer programs that impact vehicle performance. Tractor trailer technology is also advancing rapidly, and just one of those systems being hacked could have catastrophic results. WSIA conducts a biennial survey of members regarding emerging issues. Cyber exposure jumped in priority this year, with members agreeing the issue has high severity in terms of current impact industrywide. — Jacqueline Schaendorf, president and CEO, Wholesale & Specialty Insurance Association

    Cyber Property Damage

    One definite area of emerging peril is the threat of substantial property destruction caused by intrusions into sensitive computer networks and connected hardware devices. Long gone are the days where the worst aspect of cyber vulnerabilities amounted to stolen credit card information or lost privacy. Instead, a new breed of cyber exposure is unfolding whereby energy infrastructure facilities and other industrial works have been targeted with cyber attacks causing explosions, wreckage and business interruption. Most expect these risks will soon expand to domestic infrastructure and transportation operations with the prospect of major instances of property damage and life-threatening injuries.

    — Joshua Gold, shareholder attorney, Anderson Kill

    Immature Market

    Cyber comes with a bit of a double-edged sword. On one hand, it is a new market that is growing faster than any other for the industry. But being an immature market means more time is needed to flesh out the data to improve underwriting. Where cyber may be a more interesting market — perhaps even one that helps us peer into the future value of insurance — is how risk mitigation tools are being incorporated into the mix. We are seeing many carriers partner with technology companies in order to assess the actual vulnerabilities within the customers. This presents more stability for underwriting. Customers’ value may evolve in the future toward risk mitigation and resilience building. This would be a shift for an industry that — at least for the past several decades — has based its value on price. — Sean Kevelighan, president and CEO, Insurance Information Institute

    Accumulation Risks

    In a study titled “Advancing Accumulation Risk Management in Cyber Insurance,” global insurance think tank The Geneva Association focused on the danger of accumulation risks as a threat to cyber insurance. The report highlights several cyber accumulation risk challenges:

    • Insurers and reinsurers could underestimate non-affirmative cyber exposure, leading to an unplanned shock from a major event. Non-affirmative cyber exposure occurs when a cyber attack causes major losses by triggering coverages in other classes.
    • Data are of insufficient quality, are incomplete and/or lack the necessary consistency for more advanced modeling techniques.
    • Governments predominantly fail to provide frameworks for the sharing of large-scale cyber-terrorism losses.

    – Anna Maria D’Hulster, secretary general, The Geneva Association

Workers Compensation & PEO: Not All States are Created Equal

According to RiskMD, a proprietary, patented, PEO-focused data management and analytics firm, some states consistently outperform others regardless of industry or NCCI hazard group; others may surprise you!

Here we looked at undeveloped loss ratios and frequency, measured as claim count per $1M in payroll, by NCCI hazard group according to the following groupings:

  • Hazard Groups A & B
  • Hazard Groups C, D & E
  • Hazard Groups F & G

We looked at this data at both national and state-specific levels for the past seven years and found that the performance of some states may surprise you. This is based on client company performance aggregated at the state and national levels, regardless of policy structure.
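The frequency measure used in this analysis, claim count per $1M in payroll, is simple to compute once claims and payroll are bucketed by hazard-group band. A minimal sketch with invented numbers (the band mapping follows the three groupings listed above; all figures are illustrative, not RiskMD data):

```python
from collections import defaultdict

# Map each NCCI hazard group to the bands used in this analysis.
HAZARD_BANDS = {"A": "A&B", "B": "A&B", "C": "C-E", "D": "C-E",
                "E": "C-E", "F": "F&G", "G": "F&G"}

def frequency_by_band(records):
    """records: (hazard_group, claim_count, payroll_dollars) tuples.
    Returns claims per $1M of payroll for each hazard-group band."""
    counts = defaultdict(float)
    payrolls = defaultdict(float)
    for group, claim_count, payroll in records:
        band = HAZARD_BANDS[group]
        counts[band] += claim_count
        payrolls[band] += payroll
    return {band: counts[band] / (payrolls[band] / 1_000_000)
            for band in counts}

# Invented numbers for illustration only.
records = [("A", 30, 40_000_000), ("B", 12, 20_000_000), ("F", 5, 25_000_000)]
print(frequency_by_band(records))
# {'A&B': 0.7, 'F&G': 0.2}
```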

NY and CA, for example, maintained their lowest loss ratios in Hazard Groups F & G and performed most poorly in Hazard Groups A & B over this seven-year period.

NJ, MA, and the national aggregate performed best by both loss ratio and frequency over the past seven years in Hazard Groups C, D & E.

Loss ratios for TX, FL and IL increased incrementally as hazard levels increased; however, for these states, the incidence of loss (frequency) was consistently higher within Hazard Groups A & B than within Hazard Groups C, D & E.

Analyzing your historical data is one of the best ways to predict future trends in your book of business and evaluate appropriate pricing of clients.

Find out what truths are hidden in your data.

Speak to a RiskMD representative to learn more: www.riskmd.com

RiskMD is Granted a Patent

A System and Method for Valuation, Acquisition and Management of Insurance Policies

ORLANDO, September 12, 2018 / — RiskMD has been granted a patent for “System and Method for Valuation, Acquisition and Management of Insurance Policies”. The patent focuses on acquiring, valuing and managing workers’ compensation client company exposures regardless of the insurance policy structure. This is the first Professional Employer Organization (“PEO”) specific patent ever issued.

Since its inception in 2005, RiskMD has been focused on understanding the diagnostics of the prospective or current co-employed client companies of a PEO within that PEO’s overall portfolio of client companies.  In order to understand which client companies fit a given portfolio and at what price, we partnered with Appulate to efficiently acquire client data and then apply a proprietary predictive model called “The Barnstable Vintage” to value, and thereby price, the client company in question.  The vision was “Geico meets workers’ compensation”: acquisition, underwriting, valuation and pricing of a client company based on a pure computer feed, with underwriter input only on an exception basis, as shown in exhibit 1 of the patent:

While there will always be a place for underwriters and underwriting, the consistency of process in acquiring and valuing business is intended to focus the underwriter on the “art” versus the “science” of underwriting.  How long in business?  Good neighborhood?  Does the owner throw birthday parties for their staff?  This is the art; the mathematical formulas behind the predictive models provide the science.

In an effort to properly manage client companies of a PEO regardless of policy structure, the last piece was to understand and then build a process revolving around a key identifier: the client company Federal Employer Identification Number (“FEIN”).  Cathy Doss, the first Chief Data Officer for Capital One and current Data Officer for Fannie Mae, architected a similar process at Capital One with the Social Security Number as the key identifier and created a similar process for RiskMD.  The combination of these processes provides the foundation for this patent and the vision of RiskMD.  The end result is the ability to spin data among the three main data pools of a PEO: policy/application data, claims data and payroll/premium data.  Using Tableau as a visualization tool on top of the SQL-built mathematical formulas, the end presentations look like the below.

Unlocking PEO client data to make more informed decisions is foremost in understanding how to acquire, value and properly manage insurance policies and the client companies that they insure.  We are passionate about proving out the value and performance of the PEO industry and know that this now-patented process will help immensely to that end.  We appreciate all of our clients’ and carriers’ support of this effort over the last five years and look forward to further deployment of this tool to the betterment of each party and the industry as a whole.

“The vision of RiskMD was to make data-driven decisions in pricing and managing PEO client companies regardless of policy structure,” said Mr. Hughes.  “Too much time was being spent diagnosing issues and not enough in treating them.  While our now-patented process has been in place for years, it is very satisfying to be recognized by the United States Patent Office for the invention.”

PEO Super Bowl 2018 – NAPEO

The industry Super Bowl has started in Phoenix: NAPEO 2018!

Right off an awesome WCI 360 in Orlando (wci360.com), where the PEO industry was once again given a full day of programming in the largest insurance industry conference countrywide, Phoenix is now the destination for all things PEO from today until Friday.  I am sure Mr. Cleary and team have their game faces on and will throw an awesome event as always.  Beautiful facility to start with, for sure.

Personally, I am celebrating my 18th NAPEO and look forward to seeing all of my old friends who have also dedicated their respective careers to PEO.  Can’t wait to rock it out again with you, starting for some of us at the NAPEO Political Action Committee dinner this evening.

Young Consumers Willing to Let Insurers Spy on Digital Data – If It Cuts Premiums

As a sociology major and Orwellian, it is hard for me not to think about “Big Brother” when reading these types of reports.  My gut would tell me that the younger generation, the people who understand data management the best, would be the most concerned about data collection; seemingly, that is not the case.

The majority of people between 18 and 34 would be willing to let insurance companies dig through their digital data, from social media to health devices, if it meant lowering their premiums, a survey shows.

In the younger group, 62 percent said they’d be happy for insurers to use third-party data from the likes of Facebook, fitness apps and smart-home devices to lower prices, according to a survey of more than 8,000 consumers globally by Salesforce.com Inc.’s MuleSoft Inc. That drops to 44 percent when the older generations are included.

As consumers share more of their personal data online, governments have increased their scrutiny of how it is collected and used following the harvesting of 61 million Facebook users’ accounts by U.K. firm Cambridge Analytica. The European Union’s new privacy law, known as the General Data Protection Regulation, took effect on May 25.

Of the older generations, 45 percent of 35- to 54-year-olds are happy to allow insurers broad access to their digital identity, while 27 percent of those 55 and older would do so.

Insurers are investing millions improving their digital offerings amid growing competition from fintech startups. But that’s a work in progress: 58 percent of the survey’s respondents said that systems don’t work seamlessly for them, with many citing difficulty filling out a form online. And 56 percent said they would switch their insurance provider if digital service is poor.

“Insurers are already struggling to deliver a connected experience,” said Jerome Bugnet, EMEA client architect at MuleSoft. That is happening “before even considering how they bring all these new data sources into the equation.”

Where Will the Wind Blow this Year? …Ask Europe

As the owner of a coastal home, the start of hurricane season always gets my attention, along with the predictive models that come with it.  As an early storm spins in the Gulf, the threat of windstorms is once again at the forefront.

As a data geek, of huge interest to me are the data pools collected, the weights they are given, the intervals at which they are updated, and the algorithms produced and interpretations made as a result.

Right out of the gate, some fun facts from our friends at the National Oceanic and Atmospheric Administration (full article at the end of this blog):

  • “A total of 10 to 16 named storms, tropical-strength or stronger, will likely cross the basin…one to four may become major hurricanes with winds of 111 miles (179 kilometers) per hour or more”
  • “Along the Atlantic and Gulf coasts there are more than 6.6 million homes with an estimated reconstruction cost of $1.5 trillion”

Unfortunately, the past has not fared well for NOAA’s US predictive model (GFS) versus that run by the European Centre for Medium-Range Weather Forecasts (ECMWF).  An article from last year that highlights the weaknesses of the US vs. European model is accessible from the link below, with highlights following.

https://mashable.com/2017/09/14/hurricane-irma-weather-forecast-models-gfs-vs-european/#03UD9HVxAOqI

  • “The issue gained prominence after Hurricane Sandy struck New Jersey in October 2012, which the European model hinted at at least a week in advance. The GFS model, however, didn’t catch on to the storm’s unusual track until about 5 days in advance”
  • Critics of the GFS say it needs to be improved with greater computer processing power. In addition, they say, the model needs to process weather information in more advanced ways, with greater resolution in both the horizontal and vertical scale, since the weather on the surface depends heavily on what is going on in the mid-to-upper atmosphere.
  • “Michael Farrar, who heads the Environmental Modeling Center (EMC), which is the lead office within the National Oceanic and Atmospheric Administration (NOAA) that develops and operates computer models, said ‘it’s no secret’ that the GFS has been behind the competition. ‘While it’s continued to improve remarkably over time… it’s consistently behind the European model,’ Farrar said in an interview.”

Having a predictive model means you have some basis for understanding the future, but not necessarily the best one.  The breadth of data ingested, the timeliness with which it is ingested, and the proper weightings within are paramount to properly forecasting outcomes.

“Forecast skill score comparisons, maintained by Brian Tang at the University of Albany, show that the European model was far superior to the GFS model during the long trek that Hurricane Irma took from off the coast of Africa, through the northern Leeward Islands, the Caribbean, Bahamas, Cuba, and then into the mainland U.S.”

“Here’s how to read this chart: The GFS model is represented by the acronym, AVNO, while the ECMWF is the European model. All the others are models from other countries and groups, such as the CMC, or Canadian model, and the UKM, from the UK Met Office. Also, the acronym, ‘OFCL,’ represents the official Hurricane Center human forecast.”

To be succinct, this shows we were half as predictive with GFS versus ECMWF.

Annotated version of model verification scores for weather models’ forecasts for Hurricane Irma.

“For now, forecasters are stuck with a temperamental model that can fail to catch on to upcoming threats until days after the European model has sounded the alarm.”

As the most innovative country on the technology front, ever… we need to step up our game in predictive analytics on the weather front – volume, velocity and variety – in order to be the world’s front line in understanding the course of “Acts of God”.  For now, the better answers appear to be across the Atlantic.

What NOAA Forecasts for 2018 Atlantic Hurricane Season

By | May 25, 2018

On the heels of the costliest hurricane year on record, the Atlantic is expected to produce five to nine of the mighty storms during the six-month season that starts June 1, the National Oceanic and Atmospheric Administration said.

A total of 10 to 16 named storms, tropical-strength or stronger, will likely cross the basin, threatening people, real estate, crops and energy resources in the U.S., Mexico and the Caribbean, according to the agency’s annual forecast Thursday. Of those, one to four may become major hurricanes with winds of 111 miles (179 kilometers) per hour or more.

“Regardless of the seasonal prediction, Atlantic and Gulf coast residents need to prepare every year,” Gerry Bell, a forecaster with the Climate Prediction Center, said during a conference call. “There are over 80 million people between the Atlantic coast and Gulf coast that can be affected by a hurricane.”

Hurricane season is closely watched by markets because about 5 percent of U.S. natural gas and 17 percent of crude comes out of the Gulf of Mexico, according to the Energy Information Administration. In addition, the hurricane-vulnerable coastline also accounts for 45 percent of U.S. refining capacity and 51 percent of gas processing.

Florida is the world’s second-largest producer of orange juice. Along the Atlantic and Gulf coasts there are more than 6.6 million homes with an estimated reconstruction cost of $1.5 trillion, according to the Insurance Information Institute in New York.

Costliest Year

Last year the U.S. was hit by three major hurricanes — Harvey, Irma and Maria — that helped drive total losses to more than $215 billion, according to Munich Re. It was the costliest season on record, surpassing 2005, which produced Katrina. Overall, 17 named storms formed in 2017, which fell in line with NOAA’s prediction of 11 to 17.

The forecast is influenced by conditions across the equatorial Pacific. Earlier this year La Nina collapsed and the ocean returned to its neutral state, with the possibility of an El Nino forming later this year. El Nino, when the Pacific warms and the atmosphere reacts, increases wind shear across the Atlantic that can tear apart hurricanes and tropical storms, reducing the overall numbers.

Conditions in the Atlantic will also play a role. Hurricanes need warm water to fuel growth, and the basin is currently running colder than normal. Forecasters are currently watching a system in the Gulf of Mexico that may become a tropical depression by Saturday.

An average to above-average season means there is a greater chance the U.S. coastline and Caribbean islands are at risk, said Bell.

“When you have a more active season you have more storms forming in the tropical Atlantic, and those storms track further westward,” Bell said. “Certain areas have been compromised from last year’s storms, which makes hurricane preparedness ever more important this year.”