Tell HN: Fiverr left customer files public and searchable
Fiverr (a gig-work/task platform, competitor to Upwork) uses a service called Cloudinary to process PDFs/images in messaging, including work products delivered from worker to client.
Besides the PDF processing value add, Cloudinary effectively acts like S3 here, serving assets directly to the web client. Like S3, it has support for signed/expiring URLs. However, Fiverr opted to use public URLs, not signed ones, for sensitive client-worker communication.
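For anyone unfamiliar with signed/expiring URLs, here's a minimal sketch of the idea in Python. The function names and key are hypothetical; Cloudinary and S3 each implement their own variant of this scheme, but the principle is the same: the URL carries an expiry and an HMAC, so a leaked or indexed link stops working.

```python
import hashlib
import hmac
import time

SECRET = b"server-side-signing-key"  # hypothetical; kept server-side, never shipped to clients

def sign_url(path: str, ttl_seconds: int = 600) -> str:
    """Append an expiry timestamp and an HMAC so the URL stops working after the TTL."""
    expires = int(time.time()) + ttl_seconds
    payload = f"{path}?expires={expires}".encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return f"{path}?expires={expires}&sig={sig}"

def verify_url(path: str, expires: int, sig: str) -> bool:
    """Reject expired or tampered URLs; compare_digest avoids timing side channels."""
    if time.time() > expires:
        return False
    payload = f"{path}?expires={expires}".encode()
    expected = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)
```

With public (unsigned) URLs there is nothing to expire or verify: anyone who obtains the link, including a search engine crawler, can fetch the asset forever.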
Moreover, it seems like they may be serving public HTML somewhere that links to these files. As a result, hundreds are in Google search results, many containing PII.
Example query: site:fiverr-res.cloudinary.com form 1040
In fact, Fiverr actively buys Google Ads for keywords like "form 1234 filing" despite knowing that it does not adequately secure the resulting work product, causing the preparer to violate the GLBA/FTC Safeguards Rule.
Responsible Disclosure Note -- 40 days have passed since this was reported to the designated vulnerability email (security@fiverr.com). The security team did not reply. Therefore, this is being made public, as it doesn't seem eligible for CVE/CERT processing (it is not really a code vulnerability), and I don't know anyone else who would care about it.
Extremely bad stuff here. Can't believe it's been 7 hours now and you can still pull up people's complete prepared tax returns right from a Google search. This should be a business-ending breach of trust and good practices, but I worry there's probably a lack of regulatory might or will to make anything happen.
It's very unfortunate but a significant amount of the most damaging stuff in this is from the underprivileged and those with minimal means who were trying to find help they could afford. Non-profits trying to get website help, confidential reports for charities trying to get translations, children seeking therapy (fiverr has a therapy category!?) for some truly dark stuff.
Utterly inexcusable that this is still up after so many hours.
Technically, 40 days and 7 hours!
...and forty nights...
...since you leaked my data away
It looks like they (cloudinary?) blocked the content.
Each result from the query site:fiverr-res.cloudinary.com form 1040 returns 404
Yikes! It should not require the service provider to block PII, but at least someone plugged the leak.
The company put out its first statement:
> “Fiverr does not proactively expose users’ private information. The content in question was shared by users in the normal course of marketplace activity to showcase work samples, under agreements and approvals between buyers and sellers. This type of content requires the buyer’s explicit consent before it can be uploaded. As always, any request to remove content is handled promptly by our team."
https://sqmagazine.co.uk/fiverr-security-flaw-private-docume...
It sounds like they are trying to claim the users involved published the links and that's why they are on Google? But how could anyone believe that multiple users intentionally published their SSN?
Re the takedown, I'm also guessing it's from Cloudinary. Maybe based on the HTTP Referer header?
The DMCA takedown also suggests at least one user was not aware of that file being public. This all comes down to what that "sharing" action specifically looked like.
ChatGPT recently had a similar case with the sharing feature on conversations leading to publicly indexed convos. That incident would have also matched the implied definition of sharing here.
I am a freelancer on Fiverr, and this is VERY concerning. The amount of PII that I have sent over Fiverr, after sending NDAs, is potentially all out in the public. I hope there will be accountability for this. IMO Fiverr has had terrible management for years! They simply do not care about their freelancers (and apparently also not about their customers).
Amazing that a company whose whole brand is based on hiring people for $5 turns out to not respect the workers who created value on their platform.
I get your criticism; however, there was a lot of talent working on the platform for many years. I was averaging 150 USD per project, significantly more than 5 USD.
Last year Fiverr started to push AI to the detriment of its freelancers, and introduced a new "success score" metric without ever specifying how it is calculated, making it very hard for freelancers to do anything about it. This caused many accounts to "lose value" and thus rank lower on searches, causing a drop in income.
I've reported this on the Fiverr Freelancer Forums, let's see how long my post stays up...
There were also a ton of artists doing the "take the job, generate it with AI, throw the slop" rip-and-run kind of artistry.
Even before AI I remember reading that many of the "custom designed" logos on Fiverr were just ripoffs of existing trademarks.
There are sites where you can buy 200 different shape stencils for $100, and most logos are just those with text added.
When I found out years down the track that I paid like $1000 for a “premium experience” to be offered 6 or so stencils like this, I was pretty furious. Luckily, I picked none of them, and made the artist draw it exactly as I later described.
> This caused many accounts to "lose value" and thus rank lower on searches
Wouldn't this make some other accounts rank higher on searches, then? I mean, it couldn't have been a problem that affected absolutely everyone, so for someone it must've been a positive change.
I assume it did not affect everyone indeed. However, a lot of "Top Rated" sellers, who were raking in good amounts of money saw their income fall drastically, while still providing the same quality of work. The only thing that changed was their Success Score being lower.
It's very hard to improve a metric when you don't know what criteria affect it. I reached out to Fiverr regarding this and they never bothered to tell anyone what impacted your Success Score. "Just do better", they told me.
One Fiverr founder's LinkedIn photo shows him in a racecar, and he posted about supply chain security a week ago.
The other also runs an insurance company (Lemonade) and just posted his drink to celebrate their 1B customers.
I never used their platform but tried a couple jobs on Upwork and drove Uber for 1000 trips. It is absolutely enraging how the CEO class lives day to day like they are some sort of "visionary" for taking a cut of other people's work while taking zero responsibility for even their own app's quality.
At one point the Uber app still told you to call a phone number for some support paths that had a recording telling you to use the app instead. Companies have systematically cut any kind of support, testing, and apparently security.
This also ties in nicely with the Delve debacle about how perfunctory those security certifications are.
Update: Fiverr denies allegations of a cybersecurity incident on X.
“To be clear, this is not a cyber incident. Fiverr does not proactively expose users’ private information. The content in question was shared by users in the normal course of marketplace activity to showcase work samples, under agreements and approvals between buyers and sellers. This type of content requires the buyer’s consent before it can be uploaded. As always, any request to remove content is handled promptly by our team.”
this is because you copied the link with the token; the token is generated for your logged-in user. strip the token and it wouldn't work. other users do not have your token.
That's true, I just checked. I will edit my post, thanks!
You send the bearer token as a GET argument?
I’m sure there’s a good reason for that. I do it in a server that I publish for general use, but won’t do it in servers that I control, as I make sure that they read headers.
Some PHP servers ignore auth headers (and, I suspect, other APIs), so you need to set general-purpose frameworks and servers to use GET arguments, but that’s a security issue, for exactly the reason you state. Too easy to leak logins. If you use headers, then copy and pasting URLs won’t leak logins.
In any case, the token should be timed, but that’s a fairly weak precaution.
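To make the leak mechanics concrete, here's a minimal sketch of the header-first pattern being discussed. This is a hypothetical handler (not Fiverr's or Cloudinary's actual code): prefer the Authorization header, and treat the query-string fallback as a known leak risk, since query strings end up in server logs, browser history, Referer headers, and any copy-pasted link.

```python
from typing import Optional
from urllib.parse import parse_qs

def extract_token(headers: dict, query_string: str) -> Optional[str]:
    """Prefer the Authorization header; fall back to ?token= only if forced to."""
    auth = headers.get("Authorization", "")
    if auth.startswith("Bearer "):
        return auth[len("Bearer "):]
    # Fallback for clients/frameworks that strip auth headers (the leak risk
    # described above): the token is now part of the shareable URL.
    qs = parse_qs(query_string)
    return qs.get("token", [None])[0]
```

If only the header path existed, copy-pasting a URL could never leak a login, which is exactly the point made above.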
Software development jobs are too accessible. Jobs with access to/control over millions of people's data should require some kind of genuine software engineering certification, and there should be business-cratering fines for something as egregious as completely ignoring security reports. It is ridiculous how we've completely normalised leaks like this on a weekly or almost-daily basis.
They may be part of it, but as a publicly traded company, there's got to be at least a few people there with a fancy pedigree (not that that actually means they are good at their job or care). But if such a test existed, they presumably would have passed it.
They also have an ISO 27001 certificate (and they try to claim a bunch of AWS's certs by proxy on their security page, which is ironic, as they say AWS stores most of their data while apparently all uploads are on Cloudinary).
A while ago I had a customer come to me who had a simple Shopify site and fell for a phishing type of attack where someone simply had an email like "shopify_security at gmail" and kept telling her she needed to apply all kinds of changes. They laundered the payments through Fiverr.
Then they would install WordPress plugins to make the site worse and claim even more "work" was needed.
I documented the entire thing, including my own credentials, and sent it off to Fiverr. Fiverr's response was everything was fine and there was nothing they could do about it, even though it was obvious fraud.
Google never did anything about it either, nor did Shopify.
Given how they handled such a minor situation like that... I guess it shouldn't be surprising they're just asleep at the switch for a major one like this.
> But if such a test existed, they presumably would have passed it
Sure, and now they could have their credentials revoked, potentially be legally liable, and never find work in this field again, which would prevent them from cocking up another company this way
At least I'm sure LLM tools deploying code to production won't result in this happening more frequently. "Make sure it's secure. Make no mistakes."
"You were right, mistakes have been made!"
Teachers have to be licensed and keep up on licensing.
Plumbers. Electricians. Lawyers. Doctors. Hell, I have to get a license to run my own business.
Why shouldn't software come with a licensing requirement if you're working with sensitive data?
We're going the other way: now any random vibe coded slop is the norm.
Normalize "vibe-plumbing"
Both plumbing and wiring are “easier” in a way than programming, as they’ll violently and potentially explosively let you know if you messed up; whereas programming lets you be blissfully unaware until you see your data plastered across the nightly news.
Wiring mistakes can kill or burn down a house months or years after they have been done. You will not notice unconnected protective earth or badly dimensioned circuit breakers until something else breaks and the protective element is not there.
There are many failure cases that are slow, especially with water. Let's say the pipes sag a bit and a connection is poor. It might slowly start leaking over months, causing structural damage or at least dampness and microbiological effects.
I'm in the midst of renovating a house at the moment, it's about ten years old.
The plumber siliconed all the shower valves to the fiberglass walls without screwing them to a backplate.
Unsurprisingly the builder is now out of business.
https://structuretech.com/?s=new+construction and the "That Ain't Right" guy https://www.tiktok.com/@gold.star.inspections are enough to tell you that if you're buying new, you want to get there before it's built, from a builder that's been in business for at least a generation, and have your own inspector riding their ass the whole way.
It is, it just usually results in immediate calls to actual plumbers without anyone else finding out. Or it’s hidden behind some new drywall and paint until a different occupant finds out.
This is a good comment
Hairdressers!
> should require some kind of genuine software engineering certification
Wouldn't change a thing, other than add another hassle you have to pay for to do your job.
This is the result of carelessness, not someone who didn't know that private data should be private because they weren't certified.
This is the result of somebody who has no idea how the fuck the tech they're using works. They surely knew it should be private, but they did not know that they were making it publicly available, because they were blindly fumbling their way around in a job beyond their competence level. There is a 0% chance this was ordinary carelessness, in the form of "I know better but don't care enough"; this is so clearly a case of "I don't know what I'm doing".
Any time someone tries to suggest certification as a solution I ask the same question: How would it have solved this problem?
Would the certification require someone to take an official certification test for the framework used?
And therefore we’re only allowed to use frameworks which have certification tests available?
If you want to write some new software, do you have to generate a certification for it and get that approved so people are allowed to use it?
Sounds like a great way to force us all to use Big Company approved software because they’re the only ones with pockets deep enough to play all of the certification games
The fact that you're thinking purely in frameworks is the exact problem that plagues the software industry. Framework-focused development is why we're in this mess; frameworks make it easy for people who don't understand how to program to publish shitty software by copying-and-pasting code and fudging around a few strings or variables to match their use case. That kind of accessibility is great for low-stakes software, letting anyone make interesting toys, but should be completely unacceptable in a professional environment with, for example, people's fucking tax documentation at stake.
If I had my way, the certification process starts at the bottom of the stack, i.e. you should be expected to have a functional knowledge of assembly instructions, memory management, registers, the call stack, and build up from there. Not that we need to write assembly on a daily basis, but all of the abstractions are built on top of that, and you cannot realistically engineer secure software if you don't understand what is being abstracted away. If you do understand the things being abstracted away, you have the fundamentals necessary to do good work with any programming language or framework. Throw in another certification starting from networking fundamentals if your job involves that. 30 years ago, most professional programmers had this level of understanding as table stakes, so we can hardly say it's an unrealistic burden that's impossible to meet.
Would it be a higher barrier to entry that massively cuts the size of the field working on sensitive software and slows software development down? Yes. That is exactly what we need. There was a time when people built bridges that collapsed; then we implemented standards and expected engineers to do real work to make sure that didn't happen. Is that work expensive and expertise-intensive? Yes. Do bridges still collapse? Only very rarely. We are witnessing software bridge collapses on a weekly basis, which should be seen as completely unacceptable. The harm is less obvious than when everyone on a bridge dies, but I do think that routinely leaking millions of people's sensitive data causes serious harm and likely does lead to people dying through second-order effects.
I follow your logic here, and it's certainly a coherent argument.
That said, there are perhaps some factors you are overlooking which matter.
The first is that no amount of certification solves the actual problem (which is that security mistakes are made, often in new and novel ways.)
Secondly, the amount of software being needed (and produced) is immense. Bridges require engineers, but the demand for new bridges is tiny. The demand for new software is enormous, and the current rate of production requires many more people than could ever be certified.
In other words, say you only allowed comp-sci graduates with a proper 4 year degree, covering assembly upwards etc. The supply of programmers would drop to what colleges could produce. Which is not nearly enough.
The analogy also falls down a bit on penalty-for-failure: a collapsed bridge kills people, while bugs in my notepad app might lead to information leaks. That's not the same thing.
In truth, at least for the last 35 years, the number of unqualified developers has exceeded qualified ones by orders of magnitude. And there still seems to be no limit to software demand.
Finally, there have been no studies I am aware of that suggest security flaws are added more frequently by non-comp-sci grads compared to comp-sci grads. Anecdotally I don't see that distinction myself. (From my observation, security outcomes correlate with the degree to which the individual considers security to be important.)
And, of course, security issues are not limited to programmers; management has a role to play as well. Should they be certified too?
So, I'm not convinced that your suggestion, however desirable, would solve the problem. And since it's clearly unimplementable in the real world it's a moot argument anyway.
"Bridges" are shorthand. There is no shortage of need for new infrastructure. Any kind of construction needs engineers involved to ensure what's being built doesn't collapse from a gust of wind.
Apparently, in the US, there seem to be about 1.5 million engineers and 4.5 million software developers. Well, I think in the short term, certifying only 1.5 million "software engineers" would be fine, actually. Note that my argument pertains only to sensitive software. If you want to make software that doesn't pose a danger to its users, you don't need an 'engineer'.
This should have the second-order benefit of making PII toxic waste. If you need a real engineering team to process PII, companies that don't need PII will stop scraping every last fucking thing and leaking it. The majority of software in the world doesn't actually need PII to function, they could just be incentivized to stop hoarding it and use a regular "software development" team if they want to deliver cheap and fast.
I also wouldn't specifically associate this with college degrees. In fact I think universities are doing a shockingly bad job of producing functional software developers. But, on the other hand, you don't need a university to produce a good programmer. Software development is possibly the most open, information-available discipline in the world. Self-motivated learners can absolutely become competent on their own. The certification should be merit-based, and provide a clear path to learning the material the certification is based on. Many people will go through the effort to educate themselves and learn the required skills, especially if certified software engineers are in high demand and command a higher salary.
Regarding the penalty-for-failure, as I said, the harm is not as immediately apparent as when people die in a bridge collapse. But leaking sensitive information still leads to people dying, even if the connection is not as direct. Doxxing and blackmail frequently lead to suicide, and there are other damages that could lead to a butterfly effect culminating in a higher death rate, or, even if not death, tangible harm. This leak contained birth certificates, IDs, passports, tax documentation, passwords, all kinds of information that could be used to ruin someone's life with identity fraud. There is also, of course, some software in the world that is directly safety-critical, much of the software used in the health field for instance, which is also currently being written by the lowest bidder in many cases.
Regarding management, they don't need a certification but rather consequences for their actions. Currently the incentive structure is such that management is rewarded for cutting costs and is never punished for harming customers. Fiverr, for instance, should be facing an investigation that threatens to shut down the business given that not only did this happen in the first place, and not only did they ignore it for 40 days, but even after it went public the sensitive files were still accessible for 12+ hours (notably, after they were definitely made aware of it, given reports in this thread of people receiving replies from Fiverr about it). Maybe throw in some criminal liability for the people most responsible for a situation this horrible. Management would tighten up real quick.
I don't agree that this is unimplementable in the real world at all. If anything it's a complete abnormality that software development is the way it is, when most other skilled professions are licensed and regulated.
i have bad news for you
> Would the certification require someone to take an official certification test for the framework used?
> And therefore we’re only allowed to use frameworks which have certification tests available?
When it's safety-critical, yes, absolutely. A service that handles sensitive PII, such as the one whose "engineers" should be prosecuted for this incident, is definitionally safety-critical.
If you're afraid in that world you'd be unable to work, maybe you deserve to be.
How did original engineering certification prevent dangerous constructions? Did it force everyone to use a Big Company?
The certification obviously would have to have teeth. A certification that you needed in order to do work as a software professional, which could be revoked for cases of carelessness or negligence, would disincentivize carelessness and negligence.
This is how airline pilot certificates work. And in that career, certification actually works. It's not a miracle or unexplainable.
It's so much worse in the industry than people think. The truth is that many people literally have no idea how to secure things, what to secure, or why to secure it; they pay no attention, are plainly ignorant of the state of the world, and are oftentimes just stupid.
I worked at a company where a customer called, confused, because when they googled our company (as they did every day to log in to their portal) they found that the driver's licenses we stored were available on the public internet.
The devs literally didn't know about direct object access and thought obfuscation was enough, didn't know about how robots.txt worked, didn't know about google webmaster shit, didn't know about sitemaps, they were just the cheapest labor the company could find who could do the thing.
This is a huge portion of outsourced labor in my experience, not because they are worse overseas in any respect, but because the people looking for cheap labor were always looking for the cheapest labor and had no idea how that applied to the actual technical work of running their business.
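For readers unfamiliar with the term, the "direct object access" issue mentioned above is usually called an insecure direct object reference (IDOR): serving a resource by ID without checking who owns it. A minimal sketch of the fix, with hypothetical names and an in-memory dict standing in for a database lookup:

```python
# doc_id -> owner_user_id (stand-in for a database; names are illustrative)
DOCUMENTS = {
    "lic-4821": "alice",
    "lic-9077": "bob",
}

def fetch_document(doc_id: str, requesting_user: str) -> str:
    """Serve a document only after an ownership check.

    Obfuscated or random IDs alone are not access control: once a URL is
    shared, logged, or indexed by a crawler, anyone holding it can fetch it."""
    owner = DOCUMENTS.get(doc_id)
    if owner is None:
        raise KeyError("not found")
    if owner != requesting_user:
        raise PermissionError("forbidden")
    return f"contents of {doc_id}"
```

The missing `if owner != requesting_user` branch is the whole bug class: the devs in the story above relied on the ID being hard to guess, which crawlers and copy-pasted links defeat immediately.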
>they were just the cheapest labor the company could find who could do the thing.
That's the problem right there. The company doesn't care. No amount of personal certification is going to fix that.
It MUST be on the companies. They should be fined out of existence for such breaches and they would quickly change tune.
> They should be fined out of existence for such breaches and they would quickly change tune.
Looks like this is a great opportunity for an object lesson. Let’s see how it goes…
As far as certification stuff…
Civil engineering has had licensing forever. That’s because Bad Things Happen, when they make mistakes.
I do think that it would be a good idea to score/certify critical infrastructure stuff. That might involve certification of the people that make it, but it should certainly involve penalties for the people responsible. That might include the authors, but it should probably also include the folks that decide to use the bad code.
I know that ISO 9000 is an attempt to address this kind of thing. In my opinion, it’s kind of a mess. I’ve worked in ISO 9000 shops, and it’s not much fun. The thing you learn, pretty quickly, is how to end-run the process, as it’s so heavy that it basically stops all forward progress. It doesn’t have to, but often does.
Mistakes get made. If you design carefully, these mistakes won’t cause real damage.
I just figured out that an app I wrote, that’s been out for two years, has an embarrassing bug (mea culpa). I’ll get it fixed today.
Because I’m pretty careful, it doesn’t affect stuff like user privacy. It just introduces performance overhead, in one operation, so the fix will mean that the app will suddenly speed up.
I’m not sure that certification would have solved it. My security mindset is why user privacy wasn’t affected, and that comes from experience.
> Good judgment comes from experience. Experience comes from bad judgement.
Also, if you are personally liable for gross negligence, you will:
1. Get paid more (as fewer fake "engineers" are available to take on the responsibility).
2. Push back harder (or at least document in detail) on malpractice during development. Manager did not listen to your warnings? Document it and when shit hits the fan, the manager gets the stick instead of you.
Hitting companies with monetary fines does not work. Hitting the employees with jail time will make sure they don't sign on dangerous or known problematic systems.
Manager not listening? Remind them they will face a trial if the issue does surface.
> Hitting companies with monetary fines does not work. Hitting the employees with jail time will make sure they don't sign on dangerous or known problematic systems.
What!? So, when you can't switch jobs because the market is bad or for any other reason, your choices are: 1) quit and lose the income (which you can't afford) or 2) sign on whatever and accept the risk of jail time?
>Wouldn't change a thing..
That's exactly what certification or licensure does; it imposes financial, civil, and criminal penalties for malpractice.
The liability of incurring penalties quickly outweighs the benefit of arbitraging costs with an unqualified practitioner.
I think just putting it on the companies is enough. If the fines are serious and can put your company out of business, and are enforced, then the companies themselves will probably work out processes for not doing stupid stuff. Whether that be creating some sort of certifications that would be prized by the companies, knowing to hire a specialized team for a security review, or anything else.
If everyone knows that messing up security gets you in real trouble and the company loses real money, and it happens all the time, and it's not just "Facebook fined $x million for doing shady stuff", then I think the industry will adapt.
Like when GDPR got released and no matter if I thought we are or are not handling PII, I had to read up and double-check my assumptions just because it was being talked about all over the place and it would be embarrassing to be caught with your pants down when you didn't actually intend to do a shady thing.
> I think just putting it on the companies is enough. If the fines are serious and can put your company out of business
They don't care. Either the fine is never enough to make them care, or the company can just go bankrupt and you go do something else.
If you or your manager has the threat of jail in the back of their mind, it's no longer just someone else's money being lost, it's personal.
> If everyone knows that messing up security gets you in real trouble and the company loses real money
There's already huge fines on paper for this, but never ever are the fines enough. It's always factored in the "cost of doing business". Also it's still someone else's money, why would an engineer care?
Please show me a GDPR fine that hit hard enough to scare companies into not fucking up? Evidently here it was not enough for Fiverr.
Edit: Just to provide an example, Takata airbags have been recalled massively (if you don't know why, look it up), but the company is now bankrupt, and who is footing the bill? Their customers.
You cannot impose a fine on them, as the company is bankrupt (now, but that was always the plan). They deliberately sold dangerous airbags, and now what can you do so it doesn't happen again? Fine them some more? Or maybe throw a few execs in jail, because they knew of the problem and continued as usual.
People at my company don't even lock their computer when they walk away from their desk. Which yeah it's in a controlled environment but still.
My work has a “donuts” slack channel for this. You find an unlocked computer you post “donuts on me!” Social pressure says they buy the office donuts.
Still get a few a week, but at least it’s public and amusing.
This would be borderline illegal in most countries. Not very enforceable, sure, but illegal.
We used to flip display upside down in display options, which also reverses the mouse. We'd then lock the PC and disconnect the keyboard. After they figured out the keyboard had been pulled they often couldn't work out why their screen was upside down...
good thing it's getting easier to code - nothing bad can come of this :-)
> some kind of genuine software engineering certification
That only gives those in power another way to push people into toeing the line. There's enough corporate authoritarianism these days as it is already. Give Stallman's "Right to Read" a read. His dystopia is exactly where we're going to be headed quickly if we keep demanding someone to "do something".
"The optimal amount of fraud is nonzero."
"Those who give up freedom for security deserve neither."
You're responding to literally 7 words out of context.
> Jobs with access to/control over millions of people's data should require some kind of genuine software engineering certification
FAANG, Fortune 500, etc., almost universally go out of their way to violate user freedom in pursuit of profit. Regulation is practically the only way to force megacorps to respect users' rights and improve their security, as evidenced by right-to-repair, surveillance/privacy, and so on.
And none of that has anything to do with users' individual rights to create, run, and modify their own software.
(Yes, regulatory capture exists, no, it doesn't mean all regulation is bad.)
If the megacorps are going in that direction of being strictly regulated, the rest of the industry will follow. It's the general movement of the Overton Window that's the underlying issue.
I once worked at a company and noticed that customer financial statements were publicly accessible. I raised it with the software team and got the reply that no one had told them it should be behind authentication. Some people really don't use their own brains.
If you do, you get into trouble with the hierarchy; all those middle-managers and responsibility-distributing committees would be unemployed.
At my last job, I opened up Shodan in my free time and clicked through our ASN with the free filters. In two minutes I found multiple iDRACs online. Surprisingly, none had a default password. But one had a years-old vuln with a public exploit allowing takeover...
Turns out that during a firewall hardware migration years ago, several units' firewalls were switched to audit mode (not enforcing rules). So an entire institute (health research!) had its whole subnet public with zero firewalls, both the server OS and the iDRAC interfaces. iDRAC isn't even supposed to be on the same VLAN per Dell, let alone on the internet.
To top it off, after making some tickets (admittedly not all as serious, e.g. MFP web UIs on the internet) from Shodan, I got pushback from the firewall team for causing units to submit too many changes.
I also got in trouble with our Qualys analyst for undermining his work because he hadn't gotten to that unit's annual review yet, even though I didn't even have a Qualys login. (And even if I had found it there, since when do we wait for annual reviews to fix that?)
It took at least three weeks internally to get it fixed, and by that I mean only the iDRAC IP blocked with the server itself still wide open.
And that's only because I mentioned it to my manager (awesome guy and not formally responsible for firewall rules) after an unrelated no firewall host incident came through and he authorized an emergency rule.
Huawei Enterprise devices tend to have a CAPTCHA by default on their BMC/OOB GUIs or the other various system/infrastructure service GUIs (such as the HuaweiCloud/FusionCloud products). I'm guessing the reason is that people leave the management ports and GUIs wide open to the public Internet, so the CAPTCHA is protecting at least from the very basic script kiddie bots.
Wow, the other comments weren't exaggerating. This is really bad. If my tax returns or other data were part of this, I might consider legal action.
I wonder if somewhere like Wired/Ars Technica/404media might pick this up?
Thanks, tip lines were a good idea
https://fiverr-res.cloudinary.com/image/upload/f_pdf,q_auto/...
This is too funny
Personally, this is the funniest one to me. It turns out Fiverr uses cloudinary for their internal documents as well. (Note: this one is not confidential and is public information)
https://fiverr-res.cloudinary.com/image/upload/f_pdf,q_auto/...
I saw that too. DDG didn't give me many results beyond a few dozen.
Shows you how much these certifications are worth in reality.
Absolutely worthless pieces of paper. We had the ISO 27001 and the physical security "walk tour" or whatever it's called; I could've outsourced that to a bunch of preschoolers walking around the offices and data center rooms and would've gotten the same result. The only _actually_ working way to protect your org is to continuously attack your own systems and see what part of it breaks or leaks data.
Clearly the real issue is their 27001 expired on 15/12/2025
> I wonder if somewhere like Wired/Ars Technica/404media might pick this up?
Might also want to add El Reg [1] to the list.
1: https://www.theregister.com/
I saw that this was also reported on r/Fiverr[0]. It looks like an almost verbatim copy of this. I don’t see much discussion (so far).
[0] https://www.reddit.com/r/Fiverr/comments/1slzoey/other_atten...
Company is now telling media this is intended behavior and users knew these files were public / shared the URLs themselves. We need to get some media with wider scope to challenge that.
And additionally a failure to handle a responsible disclosure.
You followed the correct reporting instructions.
https://www.fiverr.com/.well-known/security.txt only has "Contact: security@fiverr.com" and in their help pages they say "Fiverr operates a Bug Bounty program in collaboration with BugCrowd. If you discover a vulnerability, please reach out to security@fiverr.com to receive information about how to participate in our program."
The Cloudinary fix that nobody in this thread is naming is actually two lines: upload the asset with type set to authenticated instead of the default upload type, and generate a signed URL server-side with sign_url true whenever a logged-in user requests it. Once the asset is authenticated, the public URL stops resolving entirely, so even the Google-indexed copies go cold. The reason Fiverr can't just turn this on now is that they already have years of stored messages where every reference uses the default public delivery type, and switching the existing media library from public to authenticated breaks every existing URL across the whole platform. That is the architectural brittleness someone upthread was pointing at, and it's also why the only realistic path forward for them is rotating new uploads to authenticated and accepting that the historical exposure is permanent. What would actually catch this category of mistake earlier? An SDK default that refused to upload anything as public unless you opt in.
It seems that someone sent a DMCA complaint months ago relating to this: https://lumendatabase.org/notices/53130362
> Recipient: Google LLC
This complaint was sent to Google, probably because the cloudinary.com URL appeared in their search results.
It's doubtful anyone at Fiverr was made aware of this - unless Google typically forwards these complaints to the actual host of the offending URL. Even then, it would go to Cloudinary who would in turn need to notify their client. Many hops with plenty of "someone else's problem" barriers for the message to overcome.
I wrote to security@fiverr.com and they just replied:
"You’re the second person to flag this issue to us
Please note that our records show no contact with Fiverr security regarding this matter ~40 days ago unlike the poster claims. We are currently working to resolve the situation"
So who has more incentive to lie, fiverr or OP?
Is this even a question? Obviously, the company that has publicly posted people's tax forms on the internet is very trustworthy and we should eagerly believe everything they say.
I don't think it even comes down to "lying". It's possible that they genuinely believe they didn't receive contact, but given that they are verifiably completely and totally incompetent and have no right to be employed in their current role, they've earned exactly zero benefit of doubt.
(weird to share any details about this incident to uninvolved parties via email anyway)
@janoelze -- that was my thought too, though less so that they wouldn't share a claim of not being notified at all with a third party, and more that those kinds of things need to go through legal/comms/etc., not whoever runs the security mailbox. If the person running the mailbox is not the CISO, surely they at least need the CISO's approval to say anything beyond a thank-you or follow-up questions? (And if they are the CISO, then they have bigger things to worry about than replying...)
Exactly, it doesn't have to be about them lying. It could simply be that they let go of or lost the engineer who knew what was done and why, and the next one didn't, accidentally exposing stuff.
I have uploaded the email here: https://gist.github.com/aidanbh/3da7cecb3e2496e5c5110b88f21b...
(technically, I guess that doesn't prove anything other than it is in my Sent folder? it has a message ID but I guess only the purelymail admin could confirm that)
In any event, this should never have required an outside reminder. The indexing issue may be something non-obvious, but the core decision not to use signed/expiring URLs is nothing less than good old security by obscurity.
I've contacted fiverr before about obvious fraud being conducted through their platform, and they just sent me in endless loops of "open a ticket". "No, e-mail us about it." "No, e-mail us at our security contact about it." Crickets, and then a response saying to please open a ticket.
Basically, they aren't set up for anyone to actually contact them and expect a resolution.
Oh, I got that too. Sent an email, got "Open a ticket." Then I saw on the support page that the email had opened a ticket and it was marked as solved.
For sure their internal metrics are all green and solved tickets are on the rise.
I wouldn't be surprised if their email blocks all unusual TLDs like your .dev.
Sounds like a really bad strategy for a security email address...
How to shoot yourself in both feet 101
Gee, that response doesn't sound defensive at all.
Wow, surprised this isn't blowing up more. Leaking form 1040s is egregious, let alone getting them indexed by Google...
I want to believe that it's people keeping mum until it's fixed so that the leaked PII isn't spread more widely, minimize the risk of bad actors scraping it all.
Once the leak is plugged, I would hope that Fiverr gets absolutely raked over the coals, this is egregious.
I wouldn’t be surprised if someone wrote a script to pull all the sensitive PII and it’s already on the dark net hacking forums for identity thieves. Freeze your credit at all bureaus if you ever used this site.
> Freeze your credit at all bureaus if you ever used this site.
People should always have their credit frozen no matter what. It's free and only takes a few minutes to unfreeze when you need to apply for credit
Yes it's a little bit inconvenient but so is suddenly having a car, insurance, and several iPhones in your name when somebody steals your identity...
My Experian score once took a substantial hit, dropping from ~800 to ~700 or something like that. It didn't change elsewhere. I requested the report and realized a quite significant debt had appeared on it, and it wasn't even in my name! It was under someone else's name that bore zero similarity to mine, or anyone in my family. Reported it to Experian and it got fixed after a week or two. Zero explanation for how it happened. These credit reporting agencies are a joke.
That's more of the fault of whichever institution gave the loan
Remember, if you use Google to access any of this “private” information, you’re a hacker and the state of Missouri might try to arrest you!
https://missouriindependent.com/2021/10/14/missouri-governor...
That's wild. Thousands of SSNs in there. Also a lot of Fiverr folks selling digital products and all their PDF courses are being returned for free in the search results.
Really bad stuff in the results. Very easy to find API tokens, penetration test reports, confidential PDFs, internal APIs. Fiverr needs to immediately block all static asset access until this is resolved; business continuity should not be a concern here.
lots of admin credentials too, which have probably never been changed
admin passwords to dating sites, that's the stuff people get blackmailed with
How does someone's dating site password end up in Fiverr?
it's worse than you think – it's an admin password to the ~whole site~
Oh my. I feel for the tech team at fiverr. I'm sure it's nasty in there. Sending virtual hugs.
They have a dating site password! They can get real hugs.
Personally I have more sympathy for the people who were screwed over by the incompetence of at least some of that tech team
Meanwhile, I hope they get sent to prison for being so cavalier with other people's PII.
How does an admin password to the whole site end up on Fiverr?
There are lots of passwords there (though one wonders if they were rotated). Basically, the people doing the hiring are sending PDFs with their credentials to the contractors to do the job.
Answer from Fiverr:
"To be clear, this is not a cyber incident. Fiverr does not proactively expose users' private information. The content in question was shared by users in the normal course of marketplace activity to showcase work samples, under agreements and approvals between buyers and sellers. This type of content requires the buyer's consent before it can be uploaded. As always, any request to remove content is handled promptly by our team."
https://x.com/fiverr/status/2044389801495773339?s=20
I also commented this, but I have to say their statement is false. The links to all the delivered projects are publicly accessible. I went over my orders and I could open every single one from another device, not logged in.
This is because you copied the link with the token; the token is generated for your logged-in user. Strip the token and it wouldn't work.
I tried to alert other freelancers on the Fiverr forums about this, but my post got deleted for "violating community rules". I don't see how it violated the rules; suspicious to say the least.
@dang example query feels incredibly doxxy, and feels bad form to link directly to full copies of people's [stuff] and [personal info] as seen on this page :/
I know this is all Fiverr's fault for allegedly missing the responsible disclosure, but is this the ideal way for us to discuss it, with these particular examples? I ask not to spare Fiverr, but I would be so mad if mine were the first result for OP's query, or my personal info linked directly...
@dang I agree that some/many of these links should not be posted here.
If this gets swept under the rug, it doesn't seem like they are going to do anything about it, and it will mean that only the bad people are going to be able to find this stuff... who knows for how long.
@dang doesn't work; write an email if you mean your comment in a non-performative way.
Oh? I've seen him respond to that many times in the past...assumed he had some hook for those.
Correlation vs causation?
Well, maybe.
Maybe just does a search for it on some discussions here and finds them that way.
No, he just reads HN like we do
Sometimes users email him links to them
I stopped doing that after one guy said “why shouldn’t I use @dang when you’ll just send an email for me”
If you want dang to see your comment and reply (and remember it’s dang/tomhow now), email a link to your comment to the mods using the footer contact link along with a note
The Algolia-linkin’ king ain’t doing a Regex->notification thing, OK! Thank you
There's health stuff in there too... and they're not even paying attention to this matter.
https://fiverr-res.cloudinary.com/image/upload/f_pdf,q_auto/...
Someone got an antivaxx certificate from Fiverr, not sure if that counts as delicate health information
it's been 5 hours. even manual action to take down the most sensitive files should have completed about 3 hours ago at most. what is happening.
Nothing- they are just hoping this will blow over.
Do I have to start emailing the people in the leaked documents with screenshots?
Leaving a paper trail of you having accessed unauthorized private info is a bad idea, some crazy lawyer could decide to include you in a suit. Just not worth the hassle. Email a tip line about the general situation.
Good call!
Not to forget some eager prosecutors. They can still try to prosecute for accessing material even if they end up losing. Lot of hassle there.
How is it unauthorised if it's freely available via a search without having to bypass any login?
It'd be like putting up an advert and then trying to sue anyone who sees it.
Some crazy lawyer included my parents in a traffic death suit’s defendants while they were victims who had their car badly damaged when the reckless driver rammed into two cars (including my parents’) and two pedestrians. The question isn’t whether you’re at fault, it’s whether you want to risk getting a court summons.
I'm confused about what you're saying - are you saying that your parents risked getting a court summons though they weren't at fault?
Surely the entire point of the court system is to determine who, if anyone, is at fault.
They didn't get a court summons but the court did call and send the plaintiff's filing. They were clearly not in the wrong in that case but it was still a hassle and quite a confusion. The point is people can sue you even if it's BS and you still need to respond.
In this case, leaving a paper trail of having accessed unauthorized confidential information looks a lot like being in the wrong, so the potential hassle is a lot higher. You can argue it's not unauthorized after all, and you'll likely win, but you may need to expend time and energy arguing in the first place. And it could be significant.
Edit: In addition, (a) accidentally opening a confidential document -> oops, close immediately; and (b) taking a screenshot could be different legally (NAL yada yada), doing the latter could make it a lot harder to defend yourself.
HN saves
Determining who is at fault involves extreme annoyance and inconvenience for those who had fingers pointed at them, regardless of whether or not they were actually involved. If you involve yourself willingly, you're inviting that on yourself.
Admit no fault, ignore the criticism, keep doing the same thing, receive no consequences. I wonder how the folks at Fiverr's Tel Aviv HQ learned this strategy.
I've never been in the position that I've had to deal with this. Is the best you can do in this situation to pull the files and optionally republish them to a robots.txt'd path (with authn/z, too)? I can't imagine you can get it pulled from search engines very quickly...
There's a way to submit a removal request to Google to get the content taken down, and then the easiest fix going forward would probably be a noindex directive in the response headers for future content.
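On the "noindex in the headers" part: these are PDFs, so a robots meta tag isn't available, but the X-Robots-Tag response header is honored by major crawlers on any content type. A minimal sketch (the helper name is illustrative, not any real Fiverr or Cloudinary API):

```python
def asset_headers(content_type: str) -> dict[str, str]:
    """Response headers for serving a user-uploaded file."""
    return {
        "Content-Type": content_type,
        # X-Robots-Tag works for non-HTML assets (PDFs, images) where a
        # <meta name="robots"> tag cannot be embedded in the content.
        "X-Robots-Tag": "noindex, nofollow",
        # Serving as an attachment discourages inline preview and caching.
        "Content-Disposition": "attachment",
    }

headers = asset_headers("application/pdf")
assert headers["X-Robots-Tag"] == "noindex, nofollow"
```

Note this only stops future indexing; it does nothing about the files being publicly fetchable, which is the underlying problem.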
My guess is that if they take down the public hosting, most clients would lose access to work they paid for, and Fiverr has no way to put these back behind authorization. It's just a public list of files: either everyone has access to your file or no one does, including you.
But it seems like they might've just pulled the plug for everyone. I cannot access images from a seller. They throw a 404 as well.
My guess is that there is literally no one there who knows how to fix this. Seriously, look through the proposed solutions here, plenty of devs wouldn't know how to do any of them. It might not even be possible, with their architecture, to fix it quickly and retain functionality. I have worked in a place full of noobs where I'm certain none of the devs including me would have the first idea how to fix something like this.
Still publicly available.
Just forwarded this post to a few members of Congress.
In spite of how the pollies sell it, regulation is the friend of anyone earning less than one million dollars per year. Regulation would fix this. Get on it.
Sorry, didn't make it clear. If a company does not publicly respond to a privately made security breach allegation, it is in breach of its responsibility as a business. If the allegation is rubbish, the alleger looks bad.
The best to you.
Probably not in scope but maybe https://bugcrowd.com/engagements/cloudinary will care?
This is bad.
They probably wouldn't act immediately, as there's no way for them to enable signing without breaking their client's site. The only cleanup you could do without that would be having Google pull that subdomain, I guess?
(Fiverr itself uses Bugcrowd but is private, having to first email their SOC as I did.)
I guess they used Fiverr for security
Now being reported more widely, linked back to OP, eg:
https://cyberinsider.com/fiverr-exposes-sensitive-data-via-p...
https://cybernews.com/security/fiverr-leak-exposes-user-ids-...
The files are deleted now, that was fun while it lasted!
It's been 10 hours and all the links in this comment section still work...
If you use Google AI mode, it'll be happy to find you any kind of PII.
I've been boycotting Fiverr, so I'm glad I'm not caught up in this. And judging by their response to this issue, I'm glad I've been boycotting it.
I've never tried their platform, but I once made an account on Upwork and it is absolutely ridiculous. I'm sure they are very similar.
People are asking for AWS help and giving root passwords to random contractors. A lot of people asking for CPA letters for loans and help with tax problems but their budget is under $100. And outright fraud posts are often seen asking for people to open bank accounts or otherwise bypass KYC.
Upwork now has an AI feature to help write job posts, so you constantly see things like "If you want to attract freelancers like X, I can change it." Now the job posts are all written like corporate ones, talking about "highly experienced in X" but paying almost nothing. Half the time the clients don't even know the words in their own post. And it charges freelancers every time they apply to a job, and then more to boost to the top of the list, because every job supposedly gets 30+ applications.
Upwork struck me as a straight up scam or pyramid scheme or something. Total turnoff.
I think the AI thing also transforms functional requirements into technical requirements, often incorrectly.
"I want to build a website" gets converted into:
"I'm looking for someone with 5+ years of 'hands on' React experience"
From what I’ve seen, this always ends in some small fine/settlement and “no admission of guilt”. This type of protection is the source of these mishaps.
Wow this is really really bad. Insane this hasn't been fixed yet, media outlets are going to have a fun time with this story
This is crazy! So many tax and other financial forms out in the open. But the most interesting file I’ve seen so far seems to be a book draft titled “HOOD NIGGA AFFIRMATIONS: A Collection of Affirming Anecdotes for Hood Niggas Everywhere”. I made it to page 27 out of 63.
I've read worse. Better than Dan Brown!
that bar is subterranean, haha
Link please :pray:
Now returns Null for me, but looks like it was https://fiverr-res.cloudinary.com/image/upload/f_pdf,q_auto/...
Also, a version of this appears to be currently sold on Amazon for $15 USD.
I found someone's manuscript. At first I thought it would be scandalous to find it was ghost-written, but it's actually just someone proofreading it; the annotations come up in the PDF.
I found the author on Amazon and the book still hasn't been released
this is sad
I dunno, page 27 is where it started getting good. I actually have to admit I like this guy's relentless positivity and he actually spent real money to pay someone via Fiverr to typeset it, edit it, etc. for him.
After reading it: it's super positive and really great. I wouldn't consider myself the target audience for this, but I'll probably work it into my morning practice a little for a couple weeks.
Don't know how you stumbled on this (or what you searched for) but this book is gold, has me laughing out loud:
> "I wake up happy, dancing all in the mirror and shit. My confidence is so high I’m practicing how to accept a job I haven’t even applied for yet."
How big of a client is Fiverr? Surely Cloudinary would have alerts for an enterprise client leaking stuff?
Just insane
Literally the first featured client on their landing page. Amateurs all the way down.
I was scammed on Fiverr myself, so I may be biased, but this feels consistent with the platform incentives I saw firsthand. The dispute process did not seem designed to deal well with coordinated abuse, and weak controls around sensitive files would point to the same broader issue: user safety and data handling do not appear to be high priorities.
For anybody who missed OPs bit at the end:
Responsible Disclosure Note -- 40 days have passed since this was notified to the designated vulnerability email (security@fiverr.com). The security team did not reply.
Woah that's brutal all the important information is wild in public
I tried posting a warning to /r/fiverr but the admins removed the post. And the files are STILL public...how in the world is "sitting it out" their course of action?
Edit: I'm beginning to wonder if they might be locked out of their own site at this point. How hard could it be to just shut down the asset server until they get it sorted?
The ironic thing is, since they clearly don't have much code review, they could have actually patched the site in this time! Turn on signatures and throw in a couple backend lines to generate one wherever the URLs appear. Even if you have to go back and redo it tomorrow for robust security or performance, it would be an improvement over this.
I'm not taking sides either way, but if you are of the all in on AI perspective as they are, shouldn't this be the ideal use case? It absolutely could have handled adding URL signing.
If the assets are public and not associated to my account, how could they ever restore access if they made them inaccessible?
Fiverr probably hired someone from Fiverr to do the web build security.
Files are now returning 404s as of right now, 0900 UTC 4/15.
Would be interesting if someone with an account can check if they are visible to intended users or not, and if so, if their mitigation is robust (signed URLs?).
I have an account and I can confirm that the URLs of shared files are still publicly accessible. Google's results are indeed giving 404s.
Interesting. Did the URL scheme change with any expiry or signature params (like S3's X-Amz-Expires)?
Obvious! They dont care about freelancers
Looks like the cloudinary links are returning 404 now
This is really bad, just straight up people's income, SSN and worse just right there in the search results on Brave Search even.
The stock hasn't moved. Puts?
I don't really see what the issue is here? Someone with access to the URL (either the freelancer or the client) leaked it to Google, Google crawled it. How is this Fiverr's fault? I mean, okay, they could sign URLs, but it's not like they left a folder with directory listing on. Someone with access to the URLs, not them, willingly made the URLs public.
Users should have to be authenticated to access the resource, not just rely on the URL being a secret.
this is a bad leak, appreciate the attempts at disclosure before this
They bought and.co and then dropped it. strange company
Just by scrolling over it that's really rough.
have you contacted the ftc if their regulation is being violated?
Yet another of the innumerable examples of privacy statements being completely meaningless. I do think it is important to call this out when it happens, but I'm hardly surprised.
Given the existing DMCA requests and the fact that Google has become way less aggressive about indexing this stuff, it's clear this has been going on for a while. My guess is they've gutted enough of their internal processes that they literally can't restrict access to these files without breaking their own platform.
You really can't make this shit up: https://www.linkedin.com/feed/update/urn:li:activity:7445526...
The real question is: will Fiverr be the first company to truly crash and burn from an "AI-first" approach? Go LLM, go mayhem!
> will Fiverr be the first company to truly crash and burn from an "AI-first" approach?
No. Nobody will care.
Loooool what a mess
I just looked at the Google search results... Holy cow... it is bad bad bad
Bruh this stuff is still public
Burn it to the ground.
I don't get why disclosing is considered acceptable, it seems like racketeering to me, "pay up or else I'll make this hypothetical issue an actual issue for you"
When I reported an issue and got no response, I sat on it for 6 years, reported it again, and they took the whole site down without reaching out to me. Never quite got it, but if people are doing this, it makes sense not to acknowledge any report and just play deaf.
What’s hypothetical? All this is and has been publicly accessible.
Have bad actors already found it? Who knows?
So if Fiverr isn’t going to fix it then the next best thing is to warn people.
Huh? I didn't ask for any money here or in the original email. (not that I couldn't use some, as I only have $1000 and four heavy suitcases right now, but anyway...)
I did include "bug bounty" in the email subject since they claimed to have a private program. Other than that, no mention of any kind of compensation. It probably doesn't even have any kind of resume value since it's not an actual code flaw/CVE, just an "unlocked door."
> Moreover, it seems like they may be serving public HTML somewhere that links to these files. As a result, hundreds are in Google search results, many containing PII
This is not how Google works.
It kind of is, though. Google doesn't randomly try to visit every URL on the internet. It follows links. Therefore, for these files to be indexed by Google, they need to be linked to from somewhere.
Exactly, that's why "non-public" GitHub gists work. They are public, but not indexed anywhere "by default".
Good thing, otherwise they would have exposed countless photos via Google Photos.
Today, a photo file might be hosted at:
But it used to be a little closer to:
And no auth required, URL only!
> Therefore, for these files to be indexed by Google, they need to be linked to from somewhere.
So? That’s indeed how Google works.
Google does not work how OP describes it.
I’ve investigated similar incidents in the past on other platforms, it was always user error causing links to be public.
The only thing that's user error here is the developers of Fiverr exposing files without proper session authentication.
That’s very often a deliberate design decision.
It’s bizarre UX if you link a file to someone and the link doesn’t work.
It's actually very common to link a file hosted in the cloud to a coworker or partner and it requires login.
Can you actually explain why the phrase you cited from OP is wrong? You say that ~”files need to be linked to from somewhere” is correct. How is a file linked to from somewhere [on the internet] if it’s not being served on the internet that Google crawls (ie, HTML)? The only alternative is in… API calls? That Google probably isn’t crawling?
“Fiverr might be hosting public HTML somewhere” seems like an entirely reasonable alternative phrase to “these links must be linked from somewhere [that Google can crawl] “, at least to someone who is only superficially familiar with how search works.
The distinction you imply is obvious is not, and your point is thus rather confusing to someone who is not you.
It’s a huge mistake to assume these links have to originate from fiverr-hosted HTML, it’s far more likely Google is finding them from places like GitHub repos used by fiverr-users.
That was my first thought, but is it logical to assume that 5+ unrelated people took their finished tax return URL and linked it on a website/tweet/etc? Who would do that?
Even still, Fiverr could very well have GDPR/CCPA/etc liability as the host of these files, because they related to its services, it's not just a generic file host.
It's exactly how it works, pages don't just magically appear in Google's index.
You need links to pages either from your own website or backlinks from other websites. Alternatively if the page is in your sitemap then Google will typically pick it up or you can manually submit it for indexing. For important pages you would typically want internal links, backlinks, and have it in your sitemap.
Google indexes links from places other than fiverr, odds are these links are mostly from places like GitHub.