>Not long after obtaining his doctorate, he received an inheritance from his grandfather
>“How could it be the case that this person is cofounder and CEO since 2005 and the company still exists?” The answer dates back to Karp’s decades-long friendship with Peter Thiel,
>Thiel had cofounded PayPal and sold it to eBay in October 2002 for $1.5 billion.
>Enter Karp, whose Krameresque brown curls, European wealth connections and Ph.D. masked his business inexperience.
And thus we enter the Establishment's presentation of hippie-cum-suit stories, presenting data miners for the Man as rock stars [1]. Sound familiar? It's the same story that worked on the baby boomer generation. "First I was a hippie, then I was a stock broker, now I am a hippie again."
[1] Let's see them give the token opposing community activist on page 4 the same photography treatment instead of photographing him on the margins of the frame so he looks even more obese and distended.
"presenting data miners for the Man as rock stars"
Yes, this really bothers me as well. We should believe in the power this tool has, and that should cause us to be afraid. Biz-mag dorks are always going to eat up such things, but who cares about that. The adulation by people who seem like they should know better is what rankles.
To be fair, I don't think you have to do much propaganda to sell people on the idea of emulating Palantir. Movies and novels like the Bourne series do that for you.
The idea of a brilliant government agent who understands science, math, and law and who frequently has to skirt the murky waters of government intrigue is kind of a sexy zeitgeist in America.
not everyone agrees with you, I'm afraid. More generally, this is something you'll learn as you get older: a lot of things you thought were cool aren't. Not even One Direction.
yep, the article is thoroughly seeded with absolutely non-subliminal "cool" stimuli, especially effective on young males.
Living in the Valley, that one struck home with me; it was hard to stop laughing. Aura of danger and secrecy... dragging a 6ft+, 270lb ex-Marine bodyguard around _conspicuously_ in Palo Alto. Friday night on University: I'm Dork, James Dork. Well, everybody has his own way of overcompensating.
Or maybe the tech company has, Idk, cool tech. Although I guess appreciating it would take more than a reactionary pants-pissing any time a writer happens to mention 'PRISM' and another (unrelated) company in the same article.
It's technologically damn cool, regardless of your take on a Forbes writer's take on one employee. If someone watches their demos or tours their offices and doesn't walk out at least slightly giddy, they're either paranoid of the solutions or ignorant of the problems. Or both.
>It's technologically damn cool, regardless of your take on a Forbes writer's take on one employee. If someone watches their demos or tours their offices and doesn't walk out at least slightly giddy, they're either paranoid of the solutions or ignorant of the problems. Or both.
My comments are arrogant? You clearly know nothing about either the company or the product, yet you assume the allure of it comes from the "cool" stimuli? Maybe you and I have different definitions of "cool," because Palantir's offices certainly aren't "cool" in any "stereotypical young male" type of way. The entire company is filled to the brim with nerdy Tolkien references - not "cool". One of their product lines is versioned by Carebears for god's sake - not "cool".
What makes them cool is the fact that their products solve actual problems (and create some, sure) instead of selling ads or trying to conquer to-do lists. Also for me as a designer, I like their product development structure and cycle.
I guess it would take more than a Forbes article-worth of exposure for you to see something appreciable.
>What makes them cool is the fact that their products solve actual problems [...] instead of selling ads or trying to conquer to-do lists.
yes, The Pitch that youth falls prey to so easily. Any country, any historic period. Only the smartest (or luckiest) among us understand the great lie of this pitch from the start; it takes many years for the rest of us to understand it.
I've always found Palantir terrifying. The name itself is just so brazen. This article's presentation of 'but even if you're sketched out by the defense angle, it's now being used for good!!!' doesn't help.
Also, let's not forget that Palantir helps the FBI infiltrate and attempt to destroy domestic political groups like Occupy.
Only when looked at a certain way. Strike Debt and Occupy Sandy are going strong.
It's hard to say that any decentralized organization 'failed,' exactly. Changing the national conversation isn't exactly failure. But it wasn't outright insurrection either.
And regardless of any failure due to internal issues, the executive branch paying money for intel on how to disrupt political groups isn't cool.
I wonder what logic Karp has used to convince Peter Thiel that he's still a libertarian? I mean, talk about cognitive dissonance.
“If we as a democratic society believe that license plates in public trigger Fourth Amendment protections, our product can make sure you can’t cross that line”
The majoritarian argument - That's a popular one amongst libertarians /s.
On the other hand, Jimmy Wales considers himself an Objectivist... I guess, forget the mumbo-jumbo and judge people by their works?
It's not hard. Self-interest and/or the desire to avoid trouble are powerful motivators. Ideology is much less so. If the two conflict, we find ways to bend the ideology to fit, even if to an outside observer it looks like we've turned it completely inside out.
It gets even easier when the outside forces that are compelling us to go along give us pre-written rationalizations for doing so. "You're serving your country." "You're stopping terrorists from hurting innocent people." All we have to do then is just decide to not think about the rationalization we've chosen too hard.
I've spoken to Palantir folks about the libertarian angle, and their story went like this: by maximizing the utilization of available intelligence data, they could minimize the erosion of civil liberties that occurs in the pursuit of more data.
I'm not sure that story survives outside a vacuum, but there you have it.
Palantir's organizational framework will make an excellent case study in whistleblowing, secrecy and privacy. I found the Batphone pretty interesting. In particular, I liked the fact that after a security-related incident Palantir retained the employee and strengthened their system, rather than the textbook "we fire you" without any acknowledgment of the issue with the system.
> Palantir’s privacy and civil liberties team created an ethics hotline for engineers called the Batphone: Any engineer can use it to anonymously report to Palantir’s directors work on behalf of a customer they consider unethical. As the result of one Batphone communication, for instance, the company backed out of a job that involved analyzing information on public Facebook pages. Karp has also stated that Palantir turned down a chance to work with a tobacco firm, and overall the company walks away from as much as 20% of its possible revenue for ethical reasons.
As much as the article conveys doubts about Karp's moral fortitude, at least one (former) engineer there is willing to speak his mind:
> He goes on to argue that even Palantir’s founders don’t quite understand the Palantiri seeing stones in The Lord of the Rings . Tolkien’s orbs, he points out, didn’t actually give their holders honest insights. "The Palantiri distort the truth," he says. And those who look into them, he adds, "only see what they want to see."
Something about Palantir has creeped me the hell out ever since I lived in DC and used to see their ads in areas close to the Pentagon. I am sure its founder is every bit as brilliant as people in this article say.
> the millions of pictures collected by San Leandro’s license plate cameras are now passed on to the Northern California Regional Intelligence Center (NCRIC), one of 72 federally run intelligence fusion organizations set up after 9/11. That’s where the photos are analyzed using software built by a company just across San Francisco Bay: Palantir.
it is beyond creepy; the Stasi/KGB could only dream about it.
Extremely creepy: http://www.businessinsider.com/palantir-wikileaks-2011-2#-1
Their recruitment also reeks of creepiness, sending "confidential" letters to my workplace (I don't have that address listed anywhere, and the company has many buildings in the same city).
After getting caught. The fact is that the business they're in and the technology they're developing will continue to push them to do questionable things. They rely on large contracts with government and big corporations. They will have to judge (secretly) whether the use of their technology is moral or not. Do you think they will turn down hundreds of millions because their technology is being used for questionable moral purposes?
But they have a "batphone" that lets their employees anonymously report ethical abuses. Haha, batphone, isn't that quirky and cool and something we can get onboard with?
I wonder how it is legal for a bank to view data obtained by monitoring the Internet at a scale that only the NSA should be capable of:

> A Palantir user at a bank can, in seconds, see connections between a Nigerian Internet protocol address, a proxy server somewhere within the U.S. and payments flowing out from a hijacked home equity line of credit, just as military customers piece together fingerprints on artillery shell fragments, location data, anonymous tips and social media to track down Afghani bombmakers.
Given our (somewhat irrational) risk-aversion towards things like terrorism, the success of this sort of tech seems pretty much inevitable.
The stable state we'll probably reach is a more extreme version of what Karp mentions: next to zero privacy, but also high levels of transparency into the operations of our authorities, and, as a result of both, much more cultural acceptance of "deviancy".
Their civil liberties controls seem like a fairly sane attempt to smooth out the transition to that sort of society (but admittedly of unclear effectiveness). They're trying to avoid a detour into a totalitarian police state while our laws and cultural norms catch up with the reality of our growing data collection and data mining capabilities.
Edit: To clarify the logic of my first sentence above: if we keep voting out elected officials who fail to stop terrorist attacks (which are almost always disproportionately sensational compared to their actual death tolls), we're basically evolving our set of politicians into people who will stop at nothing to prevent them.
> but also high levels of transparency into the operations of our authorities
You are kidding yourself. How about zero privacy for citizens and no transparency and no accountability for government? I could understand why they don't share it with me and you, civilians, but they don't even share it with congressmen, judges or anyone else in between.
That's a dreadful vision. I expect a lot of people got into technology because it's "clean." There's little chance of harming people or the environment with what you do. It will be interesting to see if people become disgusted by technology, in general.
Or at least, we've managed to push the unclean parts of it into places far away where we don't have to personally see it -- assembly lines in China, coltan (http://en.wikipedia.org/wiki/Coltan) mines in the Congo, etc. It's not like the old days of, say, the garment business, where the abuse was happening right in New York City for everyone to see (cf. http://en.wikipedia.org/wiki/Triangle_Shirtwaist_Factory_fir...). It's been shuffled off to places where customers and cameras can't go.
This makes it easier to pretend to ourselves that what we do is not tied up with the human costs in those places. It's easy to not think about something you never see.
There would be some benefits. For example, if it became trivial to detect that a person was a user of illegal drugs, we'd very quickly change our drug laws to something more congruent with reality. Instead, we are in a situation where our drug laws are only tolerated because they're completely financially unrealistic to enforce the way they're written. That leads to selective enforcement, which gives authorities way too much discretionary power.
So at the height of the next privacy/NSA/Snowden debates, I guess, I should expect society to be vigorously defending the privacy rights of the next Anthony Weiner.
Yup I am sure that day will come.
Sounds like parent doesn't get much human contact.
So if you are saying in future, Anthony Weiners will get elected, then again I have to disagree with you.
Thanks to human nature, I see the exact opposite happening with our leaders.
And to your main point, the tech is inevitable because it's finally viable. As to its success, I'm highly doubtful, for two reasons:

1) There is a cost to transparency, both to individuals and organizations. Once the cost is felt, information will stop flowing as freely as it does today.

2) Just because there is a machine that can digest unlimited information and produce a list of threats doesn't mean unlimited resources exist to address all of them, not to mention address them effectively.

Whether it was the Boston bombings or the Wall Street meltdown, both events happened despite the information being available. Just as economics, in the end, reduced nuclear arsenals, economics will limit the size and scope of these machines.
I really don't see us reaching any stable state with zero privacy and high transparency levels. Though I can see why a Zuckerberg, Schmidt or Karp might see that. It aligns well with what they do.
I'm terrified of what powerful tools like Palantir can do in the wrong hands, with really any hands without a lot of oversight, but somewhat comforted by this article (and by the Palantir employees I know personally) being generally "techie-libertarian" types -- they at least are more willing to consider the downsides of universal surveillance than a lot of their "customers" are.
It would be amazing if Palantir was a 5-10x better tool for CIA, but came with non-overridable limits which actually protected individual liberty (of US citizens and non-citizens).
There are certain things Palantir can be used for which are essentially purely good -- identifying disease outbreaks, other potential environmental danger, etc. -- which they will hopefully emphasize.
As someone who's worked in text analysis, I'm comforted to see this story told. It's what we should be talking about. Please take the time to read this full article, then go out and do something useful for our world; we're going to need it.
Sorry for the snark, but I call BS on glorification of a technology company that so far has not done anything but strip away almost everyone's privacy.
Like these 'potential' terrorists : "Nihilists, anarchists, activists, Lulzsec, Anonymous, twenty-somethings who haven’t talked to the opposite sex in five or six years."
There are some drug crimes which actually are terrorism. In Mexico, the cartels/Zetas/etc. are an existential threat, and that's spilling over. There are towns in the US where MS-13 is a locally existential threat, at least within immigrant communities. The drug gangs in the inner cities in the 1970s-1990s were, too. I'd be ok with some use of some military force (intelligence, or just straight up infantry and MPs) in certain counter-drug terrorism contexts.
(I say this as someone who is in favor of immediate legalization of MJ, and decriminalization of all other drugs, with extensive treatment available for addiction or sub-optimal drug use.)
I think you mis-interpreted what I meant. I don't think the article glorified this technology and that toward the last half it offered a balanced viewpoint. Regardless, I don't think we quite have to hit the panic button yet, but if we don't start actually improving our world by inventing technology to protect our human rights, they'll be gone before we even have a chance to really panic.
> who received the Nobel Prize in Chemistry in 1918 for his development for synthesizing ammonia, important for fertilizers and explosives. The food production for half the world's current population depends on this method for producing fertilizer.
Or are you talking about his development of chemical weapons? Surely, they weren't as devastating as nuclear weapons which were developed by an astonishing array of celebrated scientists.
> Surely, they weren't as devastating as nuclear weapons which were developed by an astonishing array of celebrated scientists.
Who were all working under the fear that Nazi scientists would develop it first and give it to Adolf Hitler. It's hard to fault them morally for thinking that it would be best for someone else to get to it first.
Except that Hitler actually managed to conquer half the world instead of being a powerless figure used as a convenient bugaboo to scare people into rationalizing immoral acts. You know?
If you're a scientist in, say, 1942, when all of Europe from Spain to Moscow is under Hitler's control, and then someone comes to you and tells you he's working on an atomic bomb, should that prospect not scare you?
Almost all of the scientists who made the A-bomb (with a few notable exceptions like Edward Teller, who never met a bomb he didn't like) agonized over the morality of it. Many organized after the Nazi threat had receded to try and put the genie back in the bottle. It's a smear to categorize them as "good Germans" who were just using Hitler as an easy way to rationalize away unjustifiable behavior.
Well, yes, about his development of chemical weapons. He also believed he was doing something good. And in the end, his family was exterminated with chemicals similar to those he invented.

In a similar vein, Mr. Karp ponders the loss of privacy at the end, brought about by the very tools he helped develop.
Bleh, the /sites/ URL used to be a giveaway that it was UGC. Now I see they're even moving their own content under there, to prevent you from discarding the (worthless) UGC. Sorry for the misinformation.
Current data mining techniques take in massive amounts of similar data and output a parsed signal. Palantir takes in massive amounts of data, preserves all of it, allows their FDEs and clients to hand-draw ontological connections between dissimilar pieces of data, and outputs a gigantic raw data object. Palantir is one of the few entities with the technical know-how and infrastructure to make this gigantic object manipulatable on any old computer or phone.
Where other miners might bring you a gold ring (which may or may not be the right fit for your finger at the time!), Palantir brings you a cart full of raw gold and a map back to the mine shaft. From there you can do whatever you want.
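For illustration only, here is a minimal sketch, in plain Python, of the distinction the parent comment draws: instead of emitting a single parsed signal, the system preserves every raw record and lets an analyst hand-draw labeled links between dissimilar pieces of data, producing a queryable graph. All record fields and link labels below are invented for the example (loosely echoing the bank scenario quoted elsewhere in the thread); this is not Palantir's actual data model or API.

```python
# Toy "ontology graph": raw records are kept verbatim, and connections
# between dissimilar records are added by hand rather than inferred.

class OntologyGraph:
    def __init__(self):
        self.records = {}   # id -> raw record, preserved as-is
        self.links = []     # (source_id, target_id, label) drawn by an analyst

    def add_record(self, record_id, record):
        self.records[record_id] = record

    def link(self, source_id, target_id, label):
        # A hand-drawn, labeled connection between two dissimilar records.
        self.links.append((source_id, target_id, label))

    def neighbors(self, record_id):
        # Walk outward from one record to everything linked to it,
        # in either direction.
        out = []
        for src, dst, label in self.links:
            if src == record_id:
                out.append((dst, label))
            elif dst == record_id:
                out.append((src, label))
        return out

g = OntologyGraph()
g.add_record("ip-1", {"type": "ip_address", "value": "197.210.0.1"})
g.add_record("proxy-1", {"type": "proxy_server", "region": "US"})
g.add_record("payment-1", {"type": "payment", "source": "home equity line"})
g.link("ip-1", "proxy-1", "routed_through")
g.link("proxy-1", "payment-1", "initiated")

print(g.neighbors("proxy-1"))
# -> [('ip-1', 'routed_through'), ('payment-1', 'initiated')]
```

The point of the sketch is that nothing is discarded: the "gold ring" (a specific answer) is just one traversal of the graph, while the raw records and links (the "cart of raw gold") remain available for any other question an analyst wants to ask later.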