Link tags: cog


Prescriptive and Descriptive Information Architectures | Jorge Arango

Interesting—this is exactly the same framing I used to talk about design systems a few years ago.

Your brain does not process information and it is not a computer | Aeon Essays

We don’t store words or the rules that tell us how to manipulate them. We don’t create representations of visual stimuli, store them in a short-term memory buffer, and then transfer the representation into a long-term memory device. We don’t retrieve information or images or words from memory registers. Computers do all of these things, but organisms do not.

How normal am I?

A fascinating interactive journey through biometrics using your face.

What the Vai Script Reveals About the Evolution of Writing - SAPIENS

How a writing system went from being a dream (literally) to a reality, codified in Unicode.

This Word Does Not Exist

This is easily my favourite use of a machine learning algorithm.

Historical Dictionary of Science Fiction

A fascinating crowdsourced project. You can read the backstory in this article in Wired magazine.

Talking out loud to yourself is a technology for thinking | Psyche Ideas

This explains rubber ducking.

Speaking out loud is not only a medium of communication, but a technology of thinking: it encourages the formation and processing of thoughts.

Make Me Think | Jim Nielsen’s Weblog

The removal of all friction shouldn’t be a goal. Making things easy and making things hard should both be design tools, employed to aid the end user towards their loftiest goals.

Phenological Mismatch - e-flux Architecture - e-flux

Over the last fifty years, we have come to recognize that the fuel of our civilizational expansion has become the main driver of our extinction, and that of many of the species we share the planet with. We are now coming to realize that this is as true of our cognitive infrastructure. Something is out of sync, felt everywhere: something amiss in the temporal order, and it is as related to political and technological shifts, shifts in our own cognition and attention, as it is to climatic ones. To think clearly in such times requires an intersectional understanding of time itself, a way of thinking that escapes the cognitive traps, ancient and modern, into which we too easily fall. Because our technologies, the infrastructures we have built to escape our past, have turned instead to cancelling our future.

James writes beautifully about rates of change.

The greatest trick our utility-directed technologies have performed is to constantly pull us out of time: to distract us from the here and now, to treat time as a kind of fossil fuel which can be endlessly extracted in the service of a utopian future which never quite arrives. If information is the new oil, we are already, in the hyper-accelerated way of present things, well into the fracking age, with tremors shuddering through the landscape and the tap water on fire. But this is not enough; it will never be enough. We must be displaced utterly in time, caught up in endless imaginings of the future while endlessly neglecting the lessons and potential actions of the present moment.

The ineffectiveness of lonely icons | Matt Wilcox, Web Developer & Tinkerer

When in doubt, label your icons.

When not in doubt, you probably should be.

The Real Danger To Civilization Isn’t AI. It’s Runaway Capitalism.

Spot-on take by Ted Chiang:

I used to find it odd that these hypothetical AIs were supposed to be smart enough to solve problems that no human could, yet they were incapable of doing something most every adult has done: taking a step back and asking whether their current course of action is really a good idea. Then I realized that we are already surrounded by machines that demonstrate a complete lack of insight, we just call them corporations.

Related: if you want to see the paperclip maximiser in action, just look at the humans destroying the planet by mining bitcoin.

Cognitive Overload - daverupert.com

From Scott McCloud to responsive design, Dave is pondering our assumptions about screen real estate:

As the amount of information increases, removing details reduces information density, thereby increasing comprehension.

It reminds me of Edward Tufte’s data-ink ratio.

Re: Brand | Happy Cog

After Clearleft’s recent rebranding, I’m really interested in Happy Cog’s redesign process:

In the near future we’ll be rolling out a new website, followed by a rebrand of Cognition, our blog. As the identity is tested against applications, much of what’s here may change. Nothing is set in stone.

Improving accessibility in Co-op wills – Digital blogs

Some interesting insights from usability and accessibility testing at the Co-op.

We used ‘nesting’ to reduce the amount of information on the page when the user first reaches it. When the user chooses an option, we ask for any other details at that point rather than having all the questions on the page at once.
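This is progressive disclosure in practice. A minimal sketch of how that nesting might be wired up, assuming hypothetical element ids rather than anything from the Co-op’s actual forms:

```typescript
// A speculative sketch of the "nesting" pattern described above: follow-up
// questions stay hidden until the user picks the option that needs them.
// The ids ("leave-a-gift", "gift-details") are invented for illustration.
function nestQuestion(optionId: string, detailsId: string): void {
  const option = document.getElementById(optionId) as HTMLInputElement | null;
  const details = document.getElementById(detailsId);
  if (!option || !details) {
    return;
  }

  details.hidden = true; // only the top-level question shows at first

  option.addEventListener("change", () => {
    // reveal the follow-up details only once the option is chosen
    details.hidden = !option.checked;
  });
}

nestQuestion("leave-a-gift", "gift-details");
```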

Questions for our first 1:1 | Lara Hogan

Shamefully, I haven’t been doing one-to-ones with my front-end dev colleagues at Clearleft, but I’m planning to change that. This short list of starter questions from Lara will prove very useful indeed.

Building the Happy Cog Test Lab - Cognition: The blog of web design

Ryan describes the research and process behind putting together a device lab for Happy Cog in Austin. Good stuff, with handy links gathered together at the end.

Lexadecimal

Hexadecimal colours and their corresponding dictionary definitions. Cute.
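The idea is easy to sketch: any six-letter word spelled only with the letters a–f is also a valid hex colour. A rough illustration (not Lexadecimal’s actual code), using a handful of sample words:

```typescript
// A speculative sketch of the idea behind Lexadecimal: find dictionary
// words that double as valid six-digit hex colours, i.e. words spelled
// only with the letters a-f. The word list here is just a sample.
const dictionary = ["accede", "beaded", "decade", "efface", "facade"];

const hexWords = dictionary.filter(
  (word) => word.length === 6 && /^[a-f]+$/.test(word)
);

for (const word of hexWords) {
  console.log(`#${word}`); // e.g. "#decade" is a perfectly valid CSS colour
}
```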

Op-Ed Contributor - Mind Over Mass Media - NYTimes.com

An excellent rebuttal by Steven Pinker to Nicholas Carr's usual trolling.

Humanising data: introducing “Chernoff Schools” for Ashdown – Blog – BERG

Matt gets an opportunity to use the Chernoff effect for visualising school data.

Racist Camera! No, I did not blink... I'm just Asian! on Flickr - Photo Sharing!

"Nikon, the racist camera" (sing it to the tune of Flight of the Concords' "Albi, the racist dragon").
