The Ultimate Guide to Product Hunt Project Statistics
Recently, our Popsters project landed in Product Hunt's Home feed. It's hard to call it a mega-successful launch, but it undoubtedly brought us a lot of useful feedback, ideas and traffic.
Some time after the publication, an idea came up: why not research the projects featured on the platform, the same way our tool's users research social media pages?

In this article, we present our research on activity at Product Hunt: engagement statistics against various parameters such as publication time, subject, specific content and other product features. As a bonus, at the end we show statistics on which products are still alive and supported and which are not.

Method

Activity data for the products were gathered via Product Hunt's API as of August 8, 2017; the product viability data were updated on October 15.
In total, 32,657 projects were analyzed, covering every project that appeared in the PH Home feed since the platform's launch (November 24, 2013).
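
For illustration, here is a minimal Python sketch of how such a crawl could be run against Product Hunt's public REST API. The endpoint, parameter names and token handling are assumptions for the example, not the exact scripts used in this research.

    # Sketch: collecting featured posts day by day via an assumed REST endpoint.
    import datetime
    import requests

    API_ROOT = "https://api.producthunt.com/v1"   # assumed base URL
    TOKEN = "YOUR_DEVELOPER_TOKEN"                # placeholder credential

    def fetch_posts_for_day(day):
        """Return the list of posts featured on a given calendar day."""
        resp = requests.get(
            f"{API_ROOT}/posts",
            params={"day": day.isoformat()},
            headers={"Authorization": f"Bearer {TOKEN}"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json().get("posts", [])

    # Walk every day from Product Hunt's launch to the snapshot date.
    start, end = datetime.date(2013, 11, 24), datetime.date(2017, 8, 8)
    all_posts, day = [], start
    while day <= end:
        all_posts.extend(fetch_posts_for_day(day))
        day += datetime.timedelta(days=1)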

The factors we took into account were: date and time of publication, number of images on the product card, video availability, tags and their count, the hunter, upvote count, and the product's relative activity (its position in the daily Top).

The research did not reveal any clear correlation between a product's comment count and its position in the Top, so this factor was excluded.

Since the number of projects published varies from day to day, comparing them by absolute position in the Top turned out to be impossible. To evaluate a project's proximity to Top 1, we instead calculated the engagement it gained relative to that day's leader, using the following equation:

Project N's relative percentage for the day = (Project N's upvote count) / (upvote count of the day's Top 1 project) * 100%

This value tells us how close each project came to Top 1.
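
For clarity, here is a small Python sketch of this calculation. The field names ("day", "votes_count", "id") are illustrative and only assume the collected posts carry an identifier, their featured day and an upvote count.

    from collections import defaultdict

    def relative_percentages(posts):
        """Compute each project's upvotes as a % of that day's Top 1 project."""
        by_day = defaultdict(list)
        for post in posts:
            by_day[post["day"]].append(post)

        result = {}
        for day, day_posts in by_day.items():
            # The day's leader; `or 1` guards against an all-zero day.
            top_votes = max(p["votes_count"] for p in day_posts) or 1
            for p in day_posts:
                result[p["id"]] = p["votes_count"] / top_votes * 100
        return result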

To evaluate each project's viability, we crawled the websites of all the published products (mobile apps and other products published on third-party platforms were not taken into account) and filtered out those with error responses (500, 404, etc.), expired SSL certificates, redirects to hosting or domain-parking services, "project closed" messages, and outdated footers (last updated in 2014 or 2015). The remaining sites were additionally filtered using traffic data from SimilarWeb: we screened out projects with an estimated visit count below 5,000.
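
A simplified Python sketch of such a liveness check is shown below. The parking-page markers and error handling are illustrative heuristics, not the exact filters we applied.

    import requests

    PARKING_HINTS = ("parked", "domain for sale", "buy this domain")  # illustrative markers

    def looks_alive(url):
        """Rough liveness check mirroring the filters described above."""
        try:
            resp = requests.get(url, timeout=15, allow_redirects=True)
        except requests.exceptions.SSLError:
            return False          # expired or invalid SSL certificate
        except requests.RequestException:
            return False          # DNS failure, timeout, connection refused, ...
        if resp.status_code >= 400:
            return False          # 404, 500, etc.
        body = resp.text.lower()
        if any(hint in body for hint in PARKING_HINTS) or "project closed" in body:
            return False          # parking page or explicit shutdown notice
        return True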

All calculations were done in Pacific Standard Time (PST).
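
For example, a UTC timestamp returned by the API can be shifted to a fixed UTC-8 offset before bucketing by hour (a sketch; the timestamp format is assumed to be ISO 8601):

    from datetime import datetime, timedelta, timezone

    PST = timezone(timedelta(hours=-8))  # fixed UTC-8 offset, matching "Pacific Standard Time"

    def publication_hour_pst(created_at_utc):
        """Convert an ISO-8601 UTC timestamp to an hour of day in PST."""
        dt = datetime.fromisoformat(created_at_utc.replace("Z", "+00:00"))
        return dt.astimezone(PST).hour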

Results

Activity vs. Time of Publication

Chart: Average upvote count vs. time of publication
Chart: Average relative activity vs. time of publication, %
Chart: Total upvotes by hour of publication