
Just read a pharmacology study (this one – yes, more marijuana research) testing bioequivalence for a new formulation of a drug.  In addition to being a solution rather than a capsule, the new formulation contained only 85% as much of the active ingredient.

The maximum concentration reached with the new version?  82.5% of the maximum concentration reached with the old version.  And only 77.3% for the active metabolite.  (Those are point estimates; the 90% confidence intervals maxed out at 91.2% and 82.5%, respectively.  These are all estimates of the geometric-mean ratio between the two, BTW.)  So basically, it looks like you’d expect from the slightly smaller dose, if things were linear in dose.

But they concluded that the two formulations were equivalent, since the numbers, although lower, were within the FDA ranges for establishing bioequivalence.

I got nerd-sniped by wondering if you could actually fallacy-of-composition this into establishing that 0% of the original dose was equivalent to 100%, via a series of studies with declining doses.  It turns out that the FDA standards are complicated (1, 2), and I’m still not sure whether you could.  But still, WTF??
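The compounding arithmetic behind the nerd-snipe is easy to sketch. A minimal illustration, under the simplifying assumption that each pairwise comparison passes so long as its geometric-mean ratio clears the FDA's 0.80 lower bound (the real standard applies to the full 90% confidence interval and, as noted above, is more complicated than this):

```python
# Toy model of chained "bioequivalence": each new formulation carries 85% of
# the previous one's dose, and (assuming linearity) shows a geometric-mean
# ratio near 0.85 against its immediate predecessor -- inside the 0.80-1.25
# acceptance window -- even as the chain drifts far from the original.

def chained_dose(step_ratio: float, n_steps: int) -> float:
    """Fraction of the original dose remaining after n successive reformulations."""
    return step_ratio ** n_steps

for n in range(1, 6):
    print(f"step {n}: {chained_dose(0.85, n):.3f} of original dose")
# By step 5 the formulation has only ~44% of the original dose, even though
# every single pairwise comparison cleared the 0.80 lower bound.
```

Whether the actual FDA machinery blocks this kind of transitivity abuse is exactly the open question above; the sketch only shows why the question is worth asking.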

shedoesnotcomprehend:

nostalgebraist:

I feel like I’ve made this post before, but it surprises me how little change we’ve seen in publication-quality written English since the rise of computers and word processors.

I was just reading a long convoluted sentence in a book from the 1940s and thinking “wow, this guy had to compose that without a delete key” – and then it struck me that, if anything, we write fewer long convoluted sentences now, even though we have delete keys.

with a delete key, when you get yourself into the middle of a sentence like that you can go back to the beginning and rephrase it

That’s true.  “Convoluted” was bad phrasing on my part, since it’s a word with negative connotations.  I would say “intricate,” but the connotations there are too positive.  Really I just meant “syntactically complicated.”

Some sentences suffer from being syntactically complicated and some benefit (and which is which depends greatly on taste), but in any case I think we write fewer of them now even though our technology (I would think?) makes them easier to write.

(via condensed-theorem-shop)

He jumped, uninvited, into the final stretch of a girls’ track meet, apparently intent on proving his athletic supremacy over the opposite sex. (The White House, reaching for exculpatory context, noted that this was a girls’ team from another school, not his own.)


Curse 3

Yet, with the utmost complacency, he stages for us a banquet, and a banquet at which we are invited to entertain the forbidding hypothesis that unseasoned and unappetizing food may be served, but at which we are finally relieved to see the diners, their appetite “whetted” by “sensuality,” fall upon food that has been delicately prepared.

It was not a sign of Philip’s littleness but of his greatness that he could get so vehement in a dispute with a little girl. Napoleon would have done so; so would Alexander the Great; so would Nelson, so would Achilles. Most modern rulers would have laughed at her and retorted with some quip too ironical for her to understand.

babydurazno:

I thought this was a meme and I was trying so hard to get it

(via rocketverliden)

typicalacademic:

nostalgebraist:

Weak, awkward, but lords of the virtual realms, programmers were as powerful in the electronic world as they were impotent in the real one – until INTERCAL, the first language that didn’t respect them, the first language whose grammar turned them into subs.

INTERCAL grinds a programmer into the dirt in a few creative ways. For example, a unique and undocumented feature of the compiler (the program that translates INTERCAL syntax into the machine code that the processor understands) was its requirements for politesse. This was represented through the PLEASE qualifier, an essential aspect of INTERCAL grammar. If PLEASE was not encountered often enough, the program would be rejected; that is, ignored without explanation by the compiler. Too often and it would still be rejected, this time for sniveling. Combined with other words that are rarely used in programming languages but appear as statements in INTERCAL, the code reads like someone pleading:


(1000) PLEASE ABSTAIN FROM IGNORING + FORGETTING

(1001) IGNORE :1 

(1002) PLEASE DO 24% :1 <- #0$#256

(1003) REMEMBER :1

(1004) READ OUT :1

(1005) PLEASE DON’T GIVE UP

(1006) GIVE UP

holy shit

I’m also a fan of the erotically charged Intercal “Hello World” program:

DO ,1 <- #13
PLEASE DO ,1 SUB #1 <- #238
DO ,1 SUB #2 <- #108
DO ,1 SUB #3 <- #112
DO ,1 SUB #4 <- #0
DO ,1 SUB #5 <- #64
DO ,1 SUB #6 <- #194
DO ,1 SUB #7 <- #48
PLEASE DO ,1 SUB #8 <- #22
DO ,1 SUB #9 <- #248
DO ,1 SUB #10 <- #168
DO ,1 SUB #11 <- #24
DO ,1 SUB #12 <- #16
DO ,1 SUB #13 <- #162
PLEASE READ OUT ,1
PLEASE GIVE UP

(from Wikipedia)
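As an aside, the politeness check described above is simple enough to mimic. A toy sketch, not the real C-INTERCAL compiler; the thresholds of roughly one-fifth and one-third are the commonly cited values, but treat them as assumptions here:

```python
# Toy version of INTERCAL's politesse check: a program is rejected if too few
# of its statements say PLEASE (insufficiently polite) or too many do
# (rejected for sniveling). Thresholds are illustrative approximations.

def politeness_verdict(statements: list[str]) -> str:
    polite = sum(s.lstrip().startswith("PLEASE") for s in statements)
    ratio = polite / len(statements)
    if ratio < 1 / 5:
        return "rejected (insufficiently polite)"
    if ratio > 1 / 3:
        return "rejected (sniveling)"
    return "accepted"

# The "Hello World" program above has 16 statements, 4 of them polite (25%),
# which lands inside the window:
print(politeness_verdict(["PLEASE DO"] * 4 + ["DO"] * 12))  # accepted
```

By that count, the Hello World program sits comfortably between the two failure modes, which presumably took some care to arrange.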
