Omnisend CL


We have completed the attached project spreadsheet as requested. Below are a few notes on our methodology and results.

Methodology

Our initial sweep found several recently published lists of top email automation platforms suitable for e-commerce. (example 1, example 2) However, we were concerned that these articles, many of which appear to have been written for SEO purposes, might include companies that serve niche markets outside Omnisend's domain. Therefore, in addition to carefully reading each company's summary during our initial vetting, we took the following steps:

  • We conducted a second sweep for lists that included Omnisend. The companies named on each were broadly similar, though they naturally varied from source to source. (example 3, example 4)
  • We also located a "top" list published by Omnisend itself. We gave these companies priority in our final selection, since they presumably reflect the companies Omnisend considers its closest true competitors.
  • Apart from Omnisend's own article, we gave priority to companies featured in multiple listings that, on initial review, appeared to offer similar services to a similar client base. Our final list thus contained nine competitors.
  • We then conducted our comparative research between the companies, working row-by-row through the project spreadsheet. This made it easier to compare and contrast each company's details: a given company's claims might seem generic on their own, but subtle differences in positioning often emerged when they were compared against the other nine.

Some Notes on Our Results

  • Our "overview" of each company is based primarily on their front page, since this represents each company's foot-in-the-door branding. With this providing the framing, we then dug deeper to complete the rest of the project spreadsheet's requirements.
  • Product Descriptions (row 16) are already covered by the Overview, Products, and Pricing sections. A separate Description entry would either repeat what has already been provided or require direct product testing, which is beyond the scope of a Wonder research request; we have therefore simply stated "See above" for all platforms.
  • Determining a platform's relative "uniqueness" and "complexity" based on the data available on their website and the available reviews was not an exact science.
    • In general, we found that most platforms' features overlapped and therefore assigned a low-to-moderate uniqueness score.
    • Our assessments of complexity and design are based almost entirely on screenshots and simplified descriptions and should be taken with a grain of salt.
  • None of the platforms we researched indicated any ability, or desire, to integrate with Amazon data, and at least one has expressed concern about how much data Amazon is collecting. We hypothesize that such integration would go against their stated missions to help clients build their own strong e-commerce platforms.
  • Usability, beyond each platform's listed features and our evaluation of its likely design, could not be determined without sampling the product.
  • Regarding "Alerts," nearly all the platforms automate email alerts to customers based on certain conditions, but we restricted ourselves to alerts to the seller.
  • We were uncertain how "KPIs" differed from "Claims" and so left row 37 blank.
  • While limited information on each company's tech stack is available from Crunchbase, the publicly available data does not distinguish one company from another, and full disclosure is paywalled at G2 Stack.
Sources