I'm Drew Breunig and I obsess about technology, media, language, and culture. I live in New York, studied anthropology, and work at PlaceIQ.

Posts tagged language

The OED Examines "Bro" 

The OxfordWords blog:

This suggests a certain element of metonymy: by being the sort of person who says “bro,” a person becomes a bro. In the immortal utterance “don’t tase me, bro” it is not the person doing the tasing who is the bro, but the person being tased. Nonetheless, the essence of bro-dom is in the eye of the beholder: precisely what defines the subculture of bros depends on one’s position in time and place, ranging from flannel-shirted frat boys, to laconic surfers, to twenty-something investment bankers. The NPR Codeswitch blog recently delineated 4 basic aspects of bro-iness: jockish, dudely, stoner-ish, and preppy. Their analysis noted that today’s bro is typically, if not exclusively, white, an interesting departure from the earlier African-American connotations of the word. This is a level of nuance that a conventional dictionary entry is ill-suited to describe: the semantic boundaries are subjective and in constant flux.

Colloquial terms of address carry interesting power.

Good question asked by Hilary Mason, pointed out by Whitney McNamara.

I could not agree more with the sentiment: for all non-engineers, learning to code should be approached not as trade education but as a component of a rounded education. The goal here isn’t writing code for production but learning how to talk, think, and work with the people who do.

That said: the learn-to-code advocates have been TERRIBLE at communicating this nuance. They focus way too much on the acts, not the ideas.

To start with, I’d stop saying the word “code” and remove it from all material. I’d also drop the proper nouns: Python, Ruby, R, whatever. The thinking isn’t about coding; it’s about concepts. Even the term ‘computer science’ is better.

It might be worth creating a language designed for understanding computer-science thinking, if only to make 100% explicit the point that it’s about thinking. A language designed not for production but for education. A language with the goal of teaching us how to hold conversations with the engineers we work with, and how to structure problems so that engineers can solve them.
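
As a sketch of what I mean, in Python only because that hypothetical teaching language doesn’t exist yet: the lesson below is the decomposition into named ideas, not the syntax. The example and its function names are mine, purely illustrative.

    # A teaching exercise where the point is the structure, not the code.
    # Each function is one idea an engineer would recognize by name.

    def normalize(text):
        # Idea 1: clean messy input before reasoning about it.
        return text.lower().split()

    def count(words):
        # Idea 2: reduce a collection to a summary structure.
        tally = {}
        for word in words:
            tally[word] = tally.get(word, 0) + 1
        return tally

    def most_common(tally):
        # Idea 3: select from the summary by an explicit criterion.
        return max(tally, key=tally.get)

    print(most_common(count(normalize("The bro saw the bro wave"))))  # "the"

Someone who can name those three steps can hold a conversation with an engineer about almost any pipeline, whether or not they ever write production code.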

"Google has prototypes for 'real-time' translation device" 

Engadget writes:

The biggest barrier, beyond the translation itself, is speech recognition. In so many words, background noise interferes with the translation software, thus affecting results. But Barra said it works “close to 100 percent” when used in “controlled environments.” Sounds perfect for diplomats, not so much for real-world conversations.

Diplomatic conversations are exactly where you should not be testing real-time translation software. Not just because these are the conversations that shape military and economic outcomes, but also because they’re so unlike normal speech.

For example, a piece from last week’s Talk of the Town by Lauren Collins featured a senior translator within the EU and his quixotic quest to fix ‘Eurenglish’:

Gardner explained that he had been motivated by the declining quality of E.U. documents, which, he said, are increasingly written by people for whom English is a second (or third or fourth) language. Many of the mistakes he identifies are false friends (“actual” for “current,” “assist at” for “attend”) or reverse-engineered oddities (“transpose” has somehow come to mean “implement”). If Gardner has his way, cows, sheep, goats, and pigs will cease to be “bovine, ovine, caprine, and porcine animals.” We will no longer see such sentences as “When the interoperability constituent is integrated into a Control-Command and Signaling On-board or Track-side Subsystem, if the missing functions, interfaces, or performances do not allow to assess whether the subsystem fully complies with the requirements of the TSI, only an Intermediate Statement of Verification may be issued.”

Good luck, Google. Humans are messier than you’ll ever know.

On “Hashtag” and Remembering the Internet is Awkward

Last night, “hashtag” was crowned Word of the Year for 2012. Ben Zimmer, chair of the New Words Committee of the American Dialect Society, notes:

This was the year when the hashtag became a ubiquitous phenomenon in online talk. In the Twittersphere and elsewhere, hashtags have created instant social trends, spreading bite-sized viral messages on topics ranging from politics to pop culture.

I think hashtag is an excellent choice, but Zimmer is underselling the decision.

Hashtag exemplifies the awkward ways we attempt to speak so computers will understand.

Computers don’t understand us. They’re getting better, but this last mile is turning out to be a doozy. Siri garbles every third word and struggles with accents, Google trips on words, and Facebook and iPhoto facial recognition systems see faces where there aren’t any. People are messy and the real world isn’t clean. It’s hard for computers to understand us.

The hashtag is us giving them a hand, providing a clue to our intentions they can easily parse. Hashtags are us talking loud and slow in a foreign land. They’re awkward, which is precisely why they’re important to note.
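
To see how easy a hand that is: a minimal sketch in Python (illustrative only, not any platform’s actual implementation) of how little machinery it takes to pull hashtags out of a post, compared with genuinely understanding the sentence around them.

    import re

    def extract_hashtags(post):
        # A hashtag announces itself: '#' plus word characters.
        # No grammar, no context, no ambiguity - one regular expression.
        return re.findall(r"#(\w+)", post)

    print(extract_hashtags("Don't tase me, bro #donttaseme #wordoftheyear"))
    # ['donttaseme', 'wordoftheyear']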

Hashtags remind us that interfaces are hard. When Google makes a promise about Google Glass, remember the hashtag. When Ray Kurzweil brings up the singularity again, point him towards the hashtag.

The hashtag shows that human/machine interactions are a negotiation. They come two steps towards us; we come one step towards them.

The hashtag also illustrates the loneliness of the internet.

But the goal of the hashtag isn’t for the computer to understand us; that’s a means to an end. The true goal of the hashtag is to connect with others, to be discovered, and to be part of a community – all online. The hashtag is us asking the computer for help, for it to put in a good word for us when someone else is searching for the pictures, words, or videos we’re posting.

The hashtag is awkward and that’s why it matters. It reminds us how imperfect computer interactions are right now.

Cultural Variations on “Have one’s cake and eat it too.”

Know the idiom, know the culture:

  • Bulgaria: “Both the wolf is full, and the lamb is whole.”
  • Denmark: “You cannot both blow and have flour in your mouth.”
  • France: “To want the butter and the money from (selling) the butter.” (The idiom can be emphasized by adding, “and the smile of the female buttermaker”)
  • Germany: “Please wash me, but don’t get me wet!”
  • Switzerland: “You can’t have the five cent coin and a Swiss bread roll.”
  • Greece: “You want the entire pie and the dog full.”
  • Italy: “To have the barrel full and the wife drunk.”
  • Russia: “It’s hard to have a seat on two chairs at once.”
  • Spain: “Wishing to be both at Mass and in the procession.”

Ah, Italy… (Via Wikipedia)

What Other Countries Call the "@" Symbol 

Lifted from Huffington Post because bullets are better than photo slideshows:

  • Denmark: Elephant’s Trunk
  • Germany & Poland: Monkey’s Tail
  • Greece: Little Duck
  • Israel: Strudel
  • Italy: Snail
  • China: Mouse
  • Kazakhstan: Moon’s Ear
  • Russia: Dog Face
  • Finland: Sleeping Cat
  • Korea: Sea Snail
  • Hungary: Maggot

In short: Americans are unimaginative.

When will legal writing become a programming language?

New York Magazine reports on an interesting development in computer/human relations:

The Southern District of New York recently became the nation’s first federal court to explicitly approve the use of predictive coding, a computer-assisted document review that turns much of the legal grunt work currently done by underemployed attorneys over to the machines.

Last month, U.S. Magistrate Judge Andrew J. Peck endorsed a plan by the parties in Da Silva Moore v. Publicis Groupe — a sex discrimination case filed against the global communications agency by five former employees — to use predictive coding to review more than 3 million electronic documents in order to determine whether they should be produced in discovery, the process through which parties exchange relevant information before trial.

Analysts expect decisions like this to open the door for an eruption of computer-analyzed legal work.

A few questions:

  1. How long will it be before lawyers are explicitly trained to write in a way that will be interpreted favorably by software?
  2. After this happens, how long will it be before legal writing evolves into a scripting language, more code to be compiled than words to be understood by humans? (A toy sketch of what that might look like follows this list.) When will the first O’Reilly book be published for this language?
  3. Will computer/human standards emerge for other fields or discourses? SEO copywriting is already on its way. What other fields might follow suit? Sports journalism?
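
Purely as a toy answer to question 2, here’s what a machine-first discovery rule might look like, sketched in Python with invented terms and names; nothing here reflects how predictive-coding products actually work.

    # Hypothetical: a clause written for the machine first, the human second.

    RESPONSIVE_TERMS = {"discrimination", "promotion", "compensation"}

    def is_responsive(document):
        # "Produce any document mentioning a responsive term" - restated as a
        # predicate a review system could apply to millions of documents.
        words = set(document.lower().split())
        return bool(words & RESPONSIVE_TERMS)

    print(is_responsive("Memo re: compensation review, Q3"))  # True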