
Ah yes, the time honoured tradition of improving your integrity department by removing transparency.


Their work is just so objectively good and correct that allowing insight into what they're doing can only harm the work they're doing.


Your statement reminded me of something I read during my undergraduate courses.

An excerpt of Abraham Lincoln’s seminal work of popular wisdom, “Boom! Goes the Capitalism”:

“Schrödinger's Policy is a seemingly paradoxical thought experiment devised by Erwin Schrödinger (after a late Wednesday night of strong ale and copious amounts of poutine) that attempts to illustrate the incompleteness of an early interpretation of quantum mechanics when going from subatomic to macroscopic systems. Schrödinger proposed his "policy" after debates with Albert Einstein over the Copenhagen interpretation, which Schrödinger defended, stating in essence that if a scenario existed where a policy making body within an (ostensibly) benevolent tech company could be so isolated from external interference (decoherence), the state of the group’s work can only be known as a superposition (combination) of possible rest states (eigenstates), because finding out (measuring the state) cannot be done without the observer interfering with the experiment — the measurement system (the observer) is entangled with the experiment.

Einstein reportedly suggested that tech companies could never be benevolent, especially when they are expected to maintain an objective view when their finances were concerned.

After an unknown number of rounds of distilled spirits derived from wormwood, Schrödinger conceded this was true and instead recast his thought experiment to feature a cat as the primary actor; a creature which he felt had a vastly higher likability.”

/hat-tip to the editors of the Schrödinger Wikipedia page


For those without a sense of humor, this is sarcasm.


I think it transcends sarcasm, circling back to how they actually feel.


your sarcasm – our oboros


I see what you did there.


Some cultures do not have sarcasm, while still having a sense of humor.

And they probably also have forms of humor that you would not understand. Which does not imply that you don't have a sense of humor.


Like talking loudly (which translates to ALL-CAPS) being considered humorous.

the louder – the funnier


Really? Thanks!


Ask Apple how that turned out for them by being completely hermetically sealed from the public. Zero transparency.

This is a low quality knee-jerk emotional comment tickling the current FB zeitgeist and HN tends to love those.


Apple isn't even remotely in the same position as Facebook.

Let alone - Apple is private, Facebook is anything but private


Apple on Apple ads : "Want relevant ads?"

Apple on Facebook ads : "Want spyware?"


Given that their Chief Integrity Officer is Guy Rosen, someone known for running Onavo, which was basically spyware... I seriously question the cognitive dissonance that FB employees must go through to believe any kind of integrity is taken seriously in that fucking company.


special circumstances. it's a democratic grey area.


Sometimes transparency encourages bad actors.


No one owes Facebook the benefit of that doubt.


The point is that protecting your own privacy should never be used as an indication of guilt.

"If you've done nothing wrong, you have nothing to hide" is the lie of the authoritarian. It is naive to think that a bad actor cannot construct a falsehood with a collection of truths.

I have worked on court cases for both individuals and (small) businesses that were innocent, but pissed off the wrong government agency, and got pushed/intimidated into pleading guilty.

Judges are flawed human beings who rule against innocent people and companies ALL THE TIME, because court cases are a flurry of biased witness accounts and misleading/incomplete/subjective evidence. Folks on HN should appreciate this most: I work in IT forensics, and judges' appreciation for the technical details of cases is usually very low, so they make mistakes.

Defense attorneys know that, and so they encourage innocent people to strike a (guilty) plea deal even when they have good evidence of their innocence, because literally anything can happen in a courtroom. Prosecutors also know this, so they throw absolutely everything they can think of in front of the judge, from hearsay to irrelevant data points, trying to influence the judge's overall impression of the defendant so that the judge gets frustrated with the complex details and rules on other, subjective evidence. It's really an eye-opening experience to see people getting railroaded in court.

Few appreciate it until it happens to them, and they are presented with a subset of facts and hearsay that the prosecution has architected to paint a misleading picture in front of the court.

That is just the unfortunate reality of the world. Protecting your own or your company's privacy is the prudent way to behave even when you are completely innocent.


No one thinks FB is evil because they aren't transparent. It is because of the observable harm to society and their perceived reluctance to mitigate that harm. Internally, they also struggle with retention, because employees don't want to work at a place that is evil, or perceived to be evil.

Thus, FB decides to have more transparency, and gets a little credit for wanting to fix the problem. All good.

Now, FB then revokes that after it leads to criticism (which is the point of being transparent...), so they revert to the previous state, only now there is even more evidence and the problem is worse.

It's the height of irony that a company that profits from selling information about you would ever try to defend itself with a right to privacy. Not only is that not the issue at hand but it speaks to the lack of self-awareness that I can only hope will doom FB to irrelevance or legal destruction.


I am not arguing Facebook's goodness or evilness.

> It's the height of irony that a company that profits from selling information about you would ever try to defend itself with a right to privacy.

Everyone deserves a robust defense - even the guilty. It is not for their justice - it is to protect the integrity of the system.

Everyone also deserves the right to privacy, because bad actors (competitors, disgruntled employees, political activists, etc.) are always happy to twist leaked facts to make a biased case. This happens all the time - there is no shortage of talking heads on spin alley.

You're right in that FB should have probably never tried to be more transparent - and doing so was naive and destined for failure. In the court of public opinion / social media (and eventually probably ACTUAL court someday), you simply cannot successfully voice your innocence. Even apologizing when you're wrong makes things worse.

My advice to clients is always...

#1. Shut up. Do not make public statements. Do not engage anyone publicly. Anything you say can and will be used against you - both in and out of court.

#2. See point above.

In my opinion, the correct strategy for a social media company these days is to be as ethical as possible, and to say as little as possible publicly about ethics or internal policies.


Your lecture on jurisprudence and standards of evidence is misplaced. This isn't a courtroom. We are not judges presiding over a case. These are matters of politics and profession as much as law and here we are free to interpret behavior, state our views and consider our associations as our knowledge and experience inform us. Further, unlike Facebook which has long since squandered whatever good faith one might imagine they were owed, we are owed the assumption of good faith while doing it.


It applies even more outside of a courtroom. Most people get information via very agenda-driven news articles, and not having "making sense of weird situations" as their day job, they're even more likely to be misled than judges.

At least a court has lots of rules built in to try and provide some amount of fairness to both sides in a debate. The political and public sphere has no such rules, making it even more vulnerable to such railroading.


"It applies even more outside of a courtroom."

Perhaps inside the heads of some. In the actual external world people make decisions based on incomplete information, first impression, secondhand observation, self serving bias and other imperfect influences. Neither you nor I will live to see this reality 'corrected.'


Yes, that's why corporations, even the most perfectly good ones in the world, still value their ability to keep things private.


That is also why, when one witnesses a corporation hide that which was previously observable, we may suspect malice.


Transparency and privacy aren't antonyms.


Pretending that transparency always has only good effects is clearly nonsense; but I think that pretending that the lack of transparency ever has good effects is suspect at best.


Indeed. I've always been deeply influenced by the old saying:

Your Character Is What You Do When Nobody Is Watching

Your chances of having a problem from stuff being leaked are greatly reduced if you aren't doing shitty things to begin with.

Not that I think that's what motivated this "whistleblower".


Guilt should not be inferred by a person or company protecting its own privacy.

It is naive to think that a bad actor cannot craft a falsehood from a subset of truths they select that you've chosen to make public.


When there are motivated leakers from both sides of the political spectrum it’s impossible to “not do shitty things”; someone will always be disappointed: https://www.foxnews.com/media/facebook-whistleblower-reveals...


Diplomats have long argued that certain concessions can only be made if the specifics of their negotiations are kept private.


Since transparency is the opposite of lack of transparency, I don't understand how transparency can have bad effects without the opposite (lack of transparency) having the opposite (good) effect.

It can't both be the case that transparency ever has bad effects and lack of transparency never has good effects: pick one of the bad effects of transparency and introduce "lack of transparency" in its place. Isn't that bad effect now gone? Isn't that a good effect of the lack of transparency?


I think it's a question of whether we're playing a single round of a game or an iterated game; or, to phrase it differently, whether we're discussing single acts or culture.

Granted, any time that transparency leads to bad effects, lack of transparency would presumably have led to the lack of those bad effects, and so be good, at least indirectly. A one-time lack of transparency can perhaps be appropriate.

However, a culture of lack of transparency is not the opposite of a culture of transparency, and those are the two poles I really see us discussing. A culture of transparency can lead to problems, but can also lead to good things (or at least to the exposure of bad things so that they can be addressed); whereas a culture of lack of transparency always will lead to problems, and any temporary good results from its imposition will be overwhelmed by its long-term ill effects.


It is also a question of whether the transparency allows dissenting views. Often, it doesn't.

Which isn't to claim that all dissenting views are good. But when you are an industry that is having a hard time with minority views, forcing the minority to take center/public stage at all times is actually somewhat counter productive.

Put differently, if you are going to encourage transparency for communication and decisions, you have to take extra care to not punish good faith mistakes.


The whole point of 1:1 is that some privacy allows people to say things they would not otherwise.

So yes, lack of transparency can have good effect.


I want to believe this. I also believe that 1:1s are a bit of an aggressive tactic to force someone to talk when they may not want to. :(


It's a hard problem. Some people don't want to talk for good, well-founded reasons; others (and I'm often guilty of this myself) don't want to talk because they just instinctively flinch away from hard but necessary discussions.


1:1s are a tool that requires a certain level of skill to be used successfully. They can be used for many purposes and in many ways.

If you're describing your past experiences, you should definitely level up that skill, especially if it makes you uncomfortable.


Well said.


Is there an example you have in mind?


Yep: https://www.amazon.com/Google-Leaks-Whistleblowers-Expos%C3%...

I'll save you the read. A Google employee downloaded 100k documents from the company-wide available internal wiki, identified all the ones that seemed anywhere from highly-to-mildly offensive to anyone in the world, and then published it all as a book on Amazon. The book has been out for about a year and continues to sell well (not to mention the hundreds of paid media appearances for the author to speak in front of like-minded audiences).

Here's the fun part: Google considered suing, but recognized that this would trigger the Streisand Effect and let him off the hook. Obviously, this was a calculated risk this dude took and that paid off (at least from the legal perspective).

Did Google act ethically 100% of the time? God no! But did it hurt them to have everything out in the open? You betcha.

So clearly there's a market out there for juicy internal documents. I would still say the first step is not to do anything wrong, ever. But once you have enough people working for you and you can't be in every meeting at all times, shit is going to happen, and people are going to want to pay to find out about it.

EDIT: replaced "filtered out" with "identified"


>>> Sometimes transparency encourages bad actors.

> I'll save you the read. A Google employee downloaded 100k documents from the company-wide available internal wiki, identified all the ones that seemed anywhere from highly-to-mildly offensive to anyone in the world, and then published it all as a book on Amazon. The book has been out for about a year and continues to sell well (not to mention the hundreds of paid media appearances for the author to speak in front of like-minded audiences).

You also have to keep in mind that a "company-wide available internal wiki" is going to be almost entirely full of boring crap that people literally have to be paid to read.

I mean, the internal wiki pages for nearly every system I've worked on in my company don't even describe what that system is supposed to do in terms someone from outside the team would understand.

> So clearly there's a market out there for juicy internal documents. I would still say the first step is not to do anything wrong, ever. But once you have enough people working for you and you can't be in every meeting at all times, shit is going to happen, and people are going to want to pay to find out about it.

It's not a "bad actor" thing to leak things like that. "Shit" that happens at low levels is still shit, even if it's not company policy coming from the top.


Google a decade ago wasn't like that. I learned so much about software development from reading internal wiki pages for things like MapReduce, GFS, Colossus, their search serving system, search algorithms in general, their authentication/ID system, GMail storage, Google Reader feed ingestion, Google Moderator voting algorithms, etc. Not to mention snooping on code reviews from folks like Jeff Dean, Rob Pike, and Guido van Rossum.

It's not like that now - things are so much more locked down. The first time I was there I could basically treat it like a Ph.D that I got paid for, in terms of learning new things. This time, it's a job where I do my tasks and receive lots of money in return. I think this is kind of a shame - I liked the environment much more when most people were there to learn and invent rather than collect a paycheck - but perhaps this is the inevitable result of a company growing past 100K employees.


>"Shit" that happens at low levels is still shit, even if it's not company policy coming from the top.

In some cases, it can be worse. If something happens at a low level that is unambiguously bad and no one at the top does anything about it despite knowing about it, then to the public, it's as if upper-management is condoning the behavior.

Case in point: the Redskins practicing human trafficking with no strong public denouncement from the team's front office.


> Here's the fun part: Google considered suing, but recognized that this would trigger the Streisand Effect and let him off the hook.

Well, their strategy worked because I’ve never heard about this book.


> But did it hurt them to have everything out in the open? You betcha.

I hear this often and find it a bit puzzling. Perhaps it's part of believing that speech is violence - thinking that critical speech or angry feelings are really hurting Google.

So how do you think it hurt them? Did they lose customers, and if so, how many? Did they miss a quarter? Have to lay people off? Cancel product initiatives?


If any of these calamities eventually happen, we'll have to blame GP post.


Honestly this doesn't happen if you have a good corporate culture to begin with (which Google has not had for quite a while).

I always go back to my experience at HP when Bill and Dave were still alive: the leadership decision to trust employees was pretty much reciprocated so openness worked just fine.

So lack of leadership skill (especially an inability to "lead by example") is usually what causes this type of situation.


This is folly because it only takes ONE person to leak documents, and there are always disgruntled employees somewhere in a company at the scale we are talking about.

You cannot assume good behavior of literally tens of thousands of employees because you have an overall "good" company culture.

That's just impossible.


Best Sellers Rank: #48,190 in Books (See Top 100 in Books)

#2 in Tribology Mechanical Engineering

#4 in Content Management

#10 in Online Internet Searching

what the hell


Is your argument that the world would be a better place if those wiki pages weren’t public?


Not at all. Most companies are not in the business of making the world a better place, and therefore they are not going to be optimizing their info access policies against that target, but instead they will be optimizing for shareholder value or what have you.

Don't confuse this statement with an endorsement - I wish it all worked differently, but this is the reality.


Most companies put a lot of money and reputation on the line advertising and taking stances on current events in such a way to imply that they are "in the business of making the world a better place". Either they need to step up and practice what they preach, or a regulatory agency like the FTC should step in when they make false claims about corporate governance.


I think this comment is giving an (interesting, IMO) example of the claim "Sometimes transparency encourages bad actors" and not making a value judgment on whether it is worth the tradeoff to limit internal transparency just to have stopped this one bad actor.


Interesting! I'm going to have to get a hard-copy of this book.


If it's the same stuff on his website it is incredibly underwhelming https://www.zachvorhies.com/google_leaks/


> But did it hurt them to have everything out in the open? You betcha.

Good. Google should not be able to hide anything from anyone, especially the public whose privacy they invade. The more leaks, the better off we are. Especially the kind of leaks that hurt them. Those are the ones we want most. If they didn't want us to find out, maybe they shouldn't have been doing it, right?


So they should make all their code open source? Have no intellectual property rights?

You are suggesting they just go bankrupt.


> You are suggesting they just go bankrupt.

I'm suggesting they stop surveilling the world's entire population. Until they stop violating our privacy, I won't feel sorry for them when others violate theirs. Total lack of privacy is exactly what these big tech companies deserve.


"Paid media appearances" - how would you know he got paid?

I'm not on his level, but I have a book and I've done "media appearances" (BBC stations in Berkshire & Bristol). You don't get paid for promoting your book.

Maybe on a TV show, the union forces them to pay you the minimum, which I believe is not very much.


This book is only ~200 pages long. What am I missing?


When the GP says "filtered out" the offensive ones, I think he means he filtered out the inoffensive ones and kept the offensive ones, which makes for a much shorter book.


Yep, sorry for the logic brain fart. Updated the parent comment accordingly.


Google used to (c. 2005-2008) let you freely download your contacts & email from GMail, either as a file or an API. Facebook used this to seed Facebook. When you signed up with an @gmail.com account for Facebook, it'd prompt you to import all your contacts, and then send them all an e-mail inviting them to sign up for Facebook.

Google's requests for some reciprocity from Facebook, where users would own their own social graph through technologies like OpenSocial, naturally fell on deaf ears. Google walled off their own platform and decided to abandon open standards for proprietary systems a couple years later, and this is a large reason why we can't have nice things.

I believe GDPR enforces the ability to download all your data and both Google and Facebook offer it, but it's practically useless now. GDPR also makes the regulatory burden of starting a new social network impractical, so it's not like we could get an ecosystem of interoperable competitors.


> I believe GDPR enforces the ability to download all your data

Only data you have provided, both directly (e.g. an entered name) and indirectly (e.g. page views). Analyses the company generates based on that data aren't included in the right of access. I would presume both Facebook and Google have a lot of data that falls in the exempt category.


Not sure if this applies but the Obama administration has spoken previously about how releasing/publishing their visitor list was something they regretted. They got mild accolades at the time of doing it (in a move to be more transparent that other administrations) and then the right wing (mainly) used the visitor log to make hay for 8 years. Then the Trump admin stopped releasing the info and no one made a big deal about it (relative to the number of stories written about who showed up on the lists the Obama admin released).


That FB whistleblower came with a PR and legal team.


If you're that whistleblower, not having legal and PR teams would have been rather stupid.


She is not a whistleblower; she is a front.


Even if this was true, should I care? If the leaked documents are real then it doesn't matter.


For who?


Limited Hangout https://en.wikipedia.org/wiki/Limited_hangout

Not necessarily saying that is the case but if you wanted to steer regulation in a direction....


FB saying "please oh please don't regulate social networks!" sure would seem like Br'er Rabbit. Most prospective regulations in this area would have a bigger impact on e.g. a completely open community-driven network, for example.


Interesting. I was thinking more along the lines of people simply siphoning money and power from FB. I don't see a grand scheme here, just some blood in the water.


Are you suggesting in a roundabout way that we should get rid of all the things that can encourage bad actors?


Facebook's existence encourages bad actors.



