ysabetwordsmith: Cartoon of me in Wordsmith persona (Default)
ysabetwordsmith ([personal profile] ysabetwordsmith) wrote2025-10-20 12:56 pm

Magpie Monday

[personal profile] dialecticdreamer is hosting Magpie Monday with a theme of "hidden expectations or accidental entanglements."

Over on [personal profile] ysabetwordsmith’s blog here on Dreamwidth, a comment on her poem “New and Innovative Approaches” led to the theme for this month’s Magpie Monday.

Because there’s such a strong tie to Frank the Crank’s write-in election to the Mercedes city council, I’d like to specify that all the prompts remain in or around Mercedes, in the Polychrome Heroics universe. It’s still possible to write a near-infinite range of prompts which connect to one or both of the broad ideas of this month’s theme. Modern society has an enormous number of hidden expectations, based on one’s age, gender, skin tone, ability or disability, and even one’s mental health. How do those expectations affect life as the town is rebuilding after a tremendous, world-changing earthquake? Is it easier to stay, where there are familiar faces and resources (stretched to the breaking point), or to leave, to possibly escape a hidden expectation?
In the Pipeline ([syndicated profile] in_the_pipeline_feed) wrote2025-10-20 01:05 pm

When Variant Proteins Aren't Actually the Variant Ones

Now, transcription and translation are indeed wonders of nature. The constant reading-off of our genetic code and its expression into proteins kind of has to be at that level, you’d figure, for living cells to work at all. But it’s important to remember that not-so-exact versions of these things are important, too. I’ve written about how error-prone mechanisms can be useful for bacteria and viruses (and indeed, how switching gears to these can be an outright roll-the-dice survival mechanism). But what about us?

Well, as impressive as the fidelity usually is, there are a lot of errors that still creep in just because of the scale of the process. And this recent preprint is an invitation to rethink our attitudes toward that. The authors have done painstaking sequencing across both normal and tumor tissue samples looking for alternate translation events. They’re out there, and the amounts of the resulting proteins depend (as the paper emphasizes) both on their rates of synthesis and their rates of degradation. You can’t rule that latter one out: some of these changes may actually allow variant proteins to accumulate.

The authors rooted through huge piles of LC/MS data from those human samples and flagged nearly nine million spectra as modified versions of known peptides. About 40% of those were good ol’ post-translational modification, and those were set aside for another day. But there were 124,000 sequences that looked to be single-amino-acid substitutions with no other modifications. There was no evidence for these at the genomic level, supporting the idea that these are mistakes in transcription and translation instead. The authors chopped that list down pretty rigorously, and by their own admission probably threw out plenty of real examples, but at the end they were left with about 9,000 unique examples, 2,000 of which were very well localized on the proteins themselves. Only about 110 of those 9,000 had been previously predicted from genomic translations.

Looking at the abundance of these alternate sequences compared to the “base sequences” they were derived from was revealing. Most of them were at a lower level, as you’d certainly have predicted. But about 10% were at a higher level. In fact, the paper estimates that for at least 360 proteins the alternate form is far more abundant than the canonical one! They call this “highly unexpected”, and I have to agree. The rest of the paper features several different attempts to get their heads around it.

That started with several attempts to explain it away due to incomplete RNA sequencing, problems with peptide ionization and detectability in the mass spec, possible origins in cleavages of other highly abundant normal precursor proteins, and more. But none of these come close to accounting for the numbers. Stipulating, then, that this result was real, they went on to analyze it across patients, proteins, and tissue types. These data strongly suggest that there are underlying biological reasons for some of those alternate-to-baseline protein ratios. Factors that influence these ratios at the RNA level include the number of codon changes that have to occur, the underlying frequency of those baseline codons (rarer ones are more likely to be substituted), and the level of uracil modifications present.

Up at the protein level, it seems that higher stability of the alternate versions (as mentioned above) is a big factor as well. Shorter peptides are less likely to have more-stable alternates than longer ones (probably because longer ones are less stable to start with). There are tissue differences (for example, a strong tendency for G-to-S substitutions in pancreatic samples), and overall, polar amino acids are significantly more likely to be involved in the high-ratio cases. As for tumor tissue versus normal, there are some real outliers there as well, with major differences in ratios across different tumor types and in tumor-versus-normal comparisons. A look at function suggests that signaling sequences and proteasome subunits are over-represented, too, as are proteins known to be involved in neurodegenerative diseases. Finally, comparing human versus mouse tissue strongly suggests that many (perhaps most) of these changes are even conserved across species!

Well, this is all certainly something to think about, and these results open up plenty of new areas for research. The authors emphasize that their strong focus on eliminating false positives probably created plenty of false negatives along the way - that is, the data presented here, as head-scratching as they are, surely only represent part of the real situation.

Mandatory snarky aside: gosh, we should have thought to ask AI about all this sooner, don’t you think? Could have saved ourselves lots of time. Similarly, I wonder if the people talking about creating a “digital cell” to let AI tell us all about biochemistry will get around to incorporating this stuff any time soon...

cyberghostface: (Joker)
cyberghostface ([personal profile] cyberghostface) wrote in [community profile] scans_daily2025-10-20 01:00 pm

Deadpool / Batman #1



“In Batman we’ve found someone who has even less time for Deadpool’s antics than Wolverine, but a city-wide threat from the Joker makes strange bedfellows (literally, if Deadpool had his way). It’s been a blast letting Deadpool loose in Gotham City and watching what happens.” -- Zeb Wells

Scans under the cut... )
alierak: (Default)
alierak ([personal profile] alierak) wrote in [site community profile] dw_maintenance2025-10-20 10:11 am

AWS outage

DW is seeing some issues due to today's Amazon outage. For right now it looks like the site is loading, but it may be slow. Some of our processes like notifications and journal search don't appear to be running and can't be started due to rate limiting or capacity issues. DW could go down later if Amazon isn't able to improve things soon, but our services should return to normal when Amazon has cleared up the outage.
paserbyp: (Default)
paserbyp ([personal profile] paserbyp) wrote2025-10-20 03:01 pm
Entry tags:
andrewducker: (screaming hedgehog)
andrewducker ([personal profile] andrewducker) wrote2025-10-20 03:54 pm
Entry tags:

Life with two kids: A short attention span

Sophia: "So mummy took a year off from her job when I was born and then she went back?"

Incredulously: "And they remembered who she was?"
brithistorian: (Default)
brithistorian ([personal profile] brithistorian) wrote2025-10-20 09:17 am
Entry tags:

My 0.02 Euro on the Louvre jewel heist

In case you've not heard about yesterday's theft of some of the French Crown Jewels from the Louvre, CNN has a good article about it. There is one paragraph from the article that I have issues with:

Christopher Marinello, the founder of Art Recovery International, said that if the thieves are just looking to get cash out as quickly as possible, they might melt down the precious metals or recut the stones with no regard for the piece’s integrity.

I suppose it's technically true that they might do this; I just don't think it's at all likely. I don't think the thieves will be looking to cash out quickly because, given the degree of planning that apparently went into this operation, I think the items were sold before they were even stolen. I think it likely that their new owner, who probably lives in Russia or the Middle East, has already taken possession of them. (And if I were one of the thieves, I'd be extremely worried that said owner might decide that their generous payment for the items wasn't sufficient to ensure my ongoing silence.)

prettygoodword: text: words are sexy (Default)
prettygoodword ([personal profile] prettygoodword) wrote2025-10-20 07:11 am

bycocket / bycoket

bycocket or bycoket (BAI-kaw-kuht) - n., a hat with a high crown and a wide brim turned up in back and coming to a point like a beak in front, worn especially in medieval Europe.


St. Helena in her fashionable (as of 1380) bycocket
Thanks, WikiMedia!

Now mostly associated with Robin Hood, but it was fashionable for both men and women (see pic) from the 1200s-1500s. Also sometimes called abacot or abococket, formed by assimilating "a bycocket" into one word. In French this is now called a chapeau à bec, hat with a beak, but the original name (which English took on) was bicoquet, from bi-, double +‎ coque, shell, which I can sort of see.

---L.
andrewducker: (Default)
andrewducker ([personal profile] andrewducker) wrote2025-10-20 03:13 pm
Entry tags:

Life with two kids: bus trip entertainment

Gideon, climbing on to Sophia's lap: "I'll be Alexa."
Sophia: "Alexa, play Soda Pop"
Gideon: sings Soda Pop
Sophia: joins in
brithistorian: (Default)
brithistorian ([personal profile] brithistorian) wrote2025-10-20 08:33 am
Entry tags:

QOTD: On exhibitions

“Exhibitions, like dreams, are temporary phenomena — but, also like dreams, they leave indelible traces in our experiences. Through a dialectical short circuit, exhibitions draw from the material culture of the past, are situated in the present, and anticipate futures.” (Adam Szymczyk, in “Passages: Koyo Kouoh, 1967-2025,” Artforum, Sept. 2025)

This was something I really enjoyed learning about in my museum studies classes. An exhibition tells a story. Sometimes it's a simple story, like "People like Monet and our museum needs money." (Although hopefully even an exhibition like that can still tell a deep story.) Sometimes it's a more complicated story, like "Here are some interesting and/or controversial things that contemporary artists are doing. You may find some of them shocking, but you should see them anyway." And sometimes, an exhibition tells a story that can totally change the way people think about something, such that the exhibition lives on in people's minds long after the wall tags have been taken down and the objects have been returned to storage.

For example, I would be very surprised to find someone who'd studied art history or museum studies in the US who had never heard of the 1992 Maryland Historical Society exhibition "Mining the Museum". This exhibition was mentioned in several of my classes, to the point that as soon as we heard "1992" and "Maryland" together, we'd start nodding, knowing what was coming next. In this exhibition, conceptual artist Fred Wilson combined items from the museum's collection that would typically be found in an art exhibition with items that are tied to the state's slave-owning past and would usually be hidden when discussing the art of the era. One photograph from the exhibition has become a shorthand for the whole thing. It's of a case labeled simply "Metalwork, 1793-1880," which contains a number of elaborate silver cups and pitchers as well as a pair of iron slave shackles.

The story that the exhibit designer is trying to tell is generally summarized in the large wall text at the beginning of the exhibition, which I've observed many people skip over in their rush to get to the "good stuff" (i.e. the objects). If you're someone who skips over the wall text at the beginning of an exhibition, I'd like to urge you not to do that — the experience of viewing the items will be even richer if you have this story in your mind as you view them. And if you're someone who already reads the wall text (thank you!), try keeping that story further to the front of your mind as you view the exhibition. You'll come to see that not only do the individual items have meaning, but the order in which you encounter them as you move through the exhibition and the ways in which they're juxtaposed spatially will also contribute to telling the story.

rolanni: (Default)
rolanni ([personal profile] rolanni) wrote2025-10-20 09:08 am

Gothic Monday

What went before ONE:  So that's +/-1,320 words on the morning. I'm not reporting the impact of these words on the WIP total because I don't know exactly where they go.

The cats were all waiting for me when I got to Steve's Office, and they stuck with me until I said, "That's a wrap," whereupon Rook and Tali got up, stretched, and followed me to the front of the house.

It's now time to have lunch, then go downstairs to perform one's duty to the cats, and monkey around with my glass for a bit.

The cloudy morning has become a sunny afternoon, though still cool.

And so it goes.
#
What went before TWO:  Aaaand that's enough fun for one day! I have finished cutting what glass I can. As Was Predicted, I did break the starfish -- twice, but the second time much better than the first (Do not laugh. The bar we're using here, as Miri Robertson once famously said, is the one that's buried in that snowpile over there). So, rather than run out of glass, I shall take what I have with me to class, prepared to Learn Better.

There's a horrifying amount of glass pieces in my scrap box. Honestly, I should go into the kaleidoscope business.

Also, the project got its tithe of blood today, so I was glad I had wimped in and taken my silly little first aid kit down to The Studio.

But! All that said -- I'm for a cup of tea and a bun, and then I do believe I'll read.

Everybody have a good evening. I'll check in tomorrow.
#
What went before THREE: New entry in Steve and Sharon's Excellent Adventure, for those who are reading along: Eager Street
#
Um. Monday? Cool and damp; rain in the forecast.

Updated my books read list -- I have read my 50th book, which is something of a relief; I really didn't think I was going to see that many.

Read the first eight chapters of the book club book last night. I really can't tell if the ... predictability is a feature or a bug. As in, yes, this; yes, this, too; no that's pretty flimsy, but it gets us where we're going; ok, yeah, they lied, what a surprise -- is just the entrance ramp into the Real Story* (feature) or if, having begun, this is how we mean to go on. Well. I'll find out.

In other news, I was inclined to feel Poorly Used when I got the news that my health insurance will be going up $30 a month in 2026, but that was before I read the newspaper and found out that this same insurance provider is dropping membership for half the state. Yes, the half that needs it the most, why do you ask?

Sigh. It's possible that Mondays aren't good for me.

P'rhaps I'll go find a cup of tea and something for breakfast.

How's everybody doing this morning?
________
*I almost had a fistfight on a panel regarding the beginning of The Goblin Emperor, in which,** with my fellow panelist insisting that it was Bad Storytelling because Basic Security mandates that you Don't Do That, and my equally impassioned argument being that this was just to "explain" how we got to the Unlikely Situation which was the Actual Story the writer wanted to tell. Wow, that was an exhausting panel.
SPOILER
SPOILER
SPOILER
SPOILER
SPOILER
SPOILER
_________
**The Emperor and all of his sons are on the same airship when it blows up.


John D. Cook ([syndicated profile] johndcook_feed) wrote2025-10-20 12:12 pm

Distribution of correlation

Posted by John

One of the more subtle ideas to convey in an introductory statistics class is that statistics have distributions.

Students implicitly think that when you calculate a statistic on a data set, say the mean, that then you have THE mean. But if your data are (modeled as) samples from a random variable, then anything you compute from those samples, such as the mean, is also a random variable. When you compute a useful statistic, it’s not as random as the data, i.e. it has smaller variance, but it’s still random.
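To make the point above concrete, here's a minimal sketch (the seed and sample sizes are my own choices, not from the post): draw many samples, compute the mean of each, and the means themselves form a distribution with variance Var(X)/N.

```python
import numpy as np

rng = np.random.default_rng(20251020)

# 1,000 independent samples of size N = 100 from a standard normal,
# and the sample mean of each one.
N = 100
means = np.array([rng.standard_normal(N).mean() for _ in range(1000)])

# The sample mean is itself a random variable. It varies across
# samples, but with much smaller variance than the data:
# Var(mean) = Var(X)/N = 1/100 here.
print(means.var())
```

The printed variance comes out close to 0.01, far smaller than the variance of the underlying data, but not zero.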

A couple days ago I wrote about Fisher’s transform for making the distribution of sample correlations closer to normal. This post will make that more concrete.

Preliminaries

We’ll need to bring in a few Python libraries. While we’re at it, let’s set the random number generator seed so the results will be reproducible.

import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import skew

np.random.seed(20251020)

Correlated RNG

Next, we’ll need a way to generate correlated random samples, specifying the correlation ρ and the sample size N.

def gen_correlated_samples(rho, N):
    mean = [0, 0]
    cov = [
        [1, rho],
        [rho, 1]
    ]
    return np.random.multivariate_normal(mean, cov, size=N)

Calculating correlation

Once we generate correlated pairs, we need to calculate their correlation. To be more precise, their linear (Pearson) correlation. To do this we’ll find the empirical correlation matrix, the sample counterpart to the covariance matrix specified in the generator code above (with unit variances, the two coincide). The correlation coefficient is then the off-diagonal element of that matrix.

def pearsonr(X):
    correlation_matrix = np.corrcoef(X[:,0], X[:,1])
    return correlation_matrix[0, 1]

Simulation

Now we’re ready to do our simulation.

M = 10000
rs = np.zeros(M)
for i in range(M):
    X = gen_correlated_samples(0.9, 100)
    rs[i] = pearsonr(X)

Notice that there are two levels of sampling. We’re generating random samples of size 100 and computing their correlation; that’s sampling our underlying data. And we’re repeating the process of computing the correlation 10,000 times; that’s sampling the correlation.

Untransformed distribution

Next we view the distribution of the correlation values.

plt.hist(rs, bins=int(np.sqrt(M)))
plt.show()
plt.close()

This gives the following plot.

It’s strongly skewed to the left, which we can quantify by calculating the skewness.

print(skew(rs))

This tells us the skewness is −0.616. A normal distribution has skewness 0. The negative sign tells us the direction of the skew.

Transformed distribution

Now let’s apply the Fisher transformation and see how it makes the distribution much closer to normal.

xformed = np.arctanh(rs)
plt.hist(xformed, bins=int(np.sqrt(M)))
plt.show()
plt.close()
print(skew(xformed))

This produces the plot below and prints a skewness value of −0.0415.

Small correlation example

We said before that when the correlation ρ is near zero, the Fisher transformation is less necessary. Here’s an example where ρ = 0.1. It’s not visibly different from a normal distribution, and the skewness is −0.1044.
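For concreteness, here's a self-contained version of the ρ = 0.1 run, mirroring the helper functions defined earlier (plotting omitted); the exact skewness printed will differ slightly from the −0.1044 quoted above since the random draws here follow a different call sequence.

```python
import numpy as np
from scipy.stats import skew

np.random.seed(20251020)

def gen_correlated_samples(rho, N):
    # Bivariate normal with unit variances and correlation rho.
    mean = [0, 0]
    cov = [[1, rho], [rho, 1]]
    return np.random.multivariate_normal(mean, cov, size=N)

def pearsonr(X):
    # Off-diagonal element of the empirical correlation matrix.
    return np.corrcoef(X[:, 0], X[:, 1])[0, 1]

M = 10000
rs = np.zeros(M)
for i in range(M):
    rs[i] = pearsonr(gen_correlated_samples(0.1, 100))

# With rho near zero, the skewness of the sample correlations
# is small in magnitude (roughly -rho, per the conjecture below).
print(skew(rs))
```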

Observation and conjecture

In our two examples, the skewness was approximately −ρ. Was that a coincidence, or does that hold more generally? We can test this with the following code.

def skewness(rho):
    rs = np.zeros(M)
    for i in range(M):
        X = gen_correlated_samples(rho, 100)
        rs[i] = pearsonr(X)
    return skew(rs)
    
rhos = np.linspace(-1, 1, 100)
ks = [skewness(rho) for rho in rhos]
plt.plot(rhos, ks)
plt.plot(rhos, -rhos, "--", color="gray")
plt.show()

Here’s the resulting plot.

It looks like the skewness is not exactly −ρ, but −cρ for some c < 1. Maybe c depends on the inner sample size, in our case 100. But it sure looks like skewness is at least approximately proportional to ρ. Maybe this is a well-known result, but I haven’t seen it before.

The post Distribution of correlation first appeared on John D. Cook.
rolanni: (lit'rary moon)
rolanni ([personal profile] rolanni) wrote2025-10-20 07:53 am
Entry tags:

Books read in 2025

50 Emilie and the Hollow World, (Emilie Adventures #1) Martha Wells (e)
49 Black Tie & Tails (Black Wolves of Boston #2), Wen Spencer (e)
48 Shards of Earth, Adrian Tchaikovsky(The Final Architecture #1)e)
47  Hemlock and Silver, T. Kingfisher (e)
46  Outcrossing, Celia Lake (Mysterious Charm #1) (e)
45  Outfoxing Fate, Zoe Chant/Murphy Lawless (Virtue Shifters)(e)
44  Atonement Sky, Nalini Singh (Psy-Changeling Trinity #9) (e)
43  Stone and Sky, Ben Aaronovitch (Rivers of London #10) (e)
42  Regency Buck, Georgette Heyer (re-re-re-&c-read)
41  I Dare, Sharon Lee and Steve Miller (Liaden Universe #7) (page proofs)
40  To Hive and to Hold, Amy Crook (The Future of Magic #1) (e)
39  These Old Shades, Georgette Heyer, narrated by Sarah Nichols (re-re-re-&c-read, 1st time audio)
38  Faking it (Dempsey Family #2), Jennifer Crusie, narrated by Aasne Vigesaa (re-re-re-&c-read, 1st time audio)
37  Copper Script, K.J. Charles (e)
36  The Masqueraders, Georgette Heyer, narrated by Eleanor Yates (re-re-re-&c-read; 1st time audio)
35  Everyone Here Spoke Sign Language: Hereditary Deafness on Martha's Vineyard, Nora Ellen Groce (e)
34  Miss Pettigrew Lives for a Day, Winifred Watson, narrated by Frances McDormand (re-re-re-&c-read; 1st time audio)
33  The Wings upon Her Back, Samantha Mills (e)
32  Death on the Green (Dublin Driver #2), Catie Murphy (e)
31  The Elusive Earl (Bad Heir Days #3), Grace Burrowes (e)
30  The Mysterious Marquess (Bad Heir Days #2), Grace Burrowes (e)
29  Who Will Remember (Sebastian St. Cyr #20), C.S. Harris (e)
28  The Teller of Small Fortunes, Julie Leong (e)
27  Check and Mate, Ali Hazelwood (e)
26  The Dangerous Duke (Bad Heir Days #1), Grace Burrowes (e)
25  Night's Master (Flat Earth #1) (re-read), Tanith Lee (e)
24  The Honey Pot Plot (Rocky Start #3), Jennifer Crusie and Bob Mayer (e)
23  Very Nice Funerals (Rocky Start #2), Jennifer Crusie and Bob Mayer (e)
22  The Orb of Cairado, Katherine Addison (e)
21  The Tomb of Dragons, (The Cemeteries of Amalo Trilogy, Book 3), Katherine Addison (e)
20  A Gentleman of Sinister Schemes (Lord Julian #8), Grace Burrowes (e)
19  The Thirteen Clocks (re-re-re-&c read), James Thurber (e)
18  A Gentleman Under the Mistletoe (Lord Julian #7), Grace Burrowes (e)
17  All Systems Red (Murderbot Diaries #1) (re-re-re-&c read) (audio 1st time)
16  Destiny's Way (Doomed Earth #2), Jack Campbell (e)
15  The Sign of the Dragon, Mary Soon Lee
14  A Gentleman of Unreliable Honor (Lord Julian #6), Grace Burrowes (e)
13  Market Forces in Gretna Green (#7 Midlife Recorder), Linzi Day (e)
12  Shakespeare: The Man Who Pays the Rent, Judi Dench with Brendan O'Hea (e)
11  Code Yellow in Gretna Green (#6 Midlife Recorder), Linzi Day (e)
10  Seeing Red in Gretna Green (#5 Midlife Recorder), Linzi Day (e)
9    House Party in Gretna Green (#4 Midlife Recorder), Linzi Day (e)*
8    Ties that Bond in Gretna Green (#3 Midlife Recorder), Linzi Day (e)
7    Painting the Blues in Gretna Green (#2 Midlife Recorder), Linzi Day (e)
6    Midlife in Gretna Green (#1 Midlife Recorder), Linzi Day (e)
5    The Goblin Emperor, Katherine Addison (Author), Kyle McCarley (Narrator) re-re-re&c-read (audio)
4    The House in the Cerulean Sea,  TJ Klune (e)
3    A Gentleman in Search of a Wife (Lord Julian #5) Grace Burrowes (e)
2    A Gentleman in Pursuit of the Truth (Lord Julian #4) Grace Burrowes (e)
1    A Gentleman in Challenging Circumstances (Lord Julian #3) Grace Burrowes (e)

_____
*Note: The list has been corrected. I did not realize that the Gretna Green novella was part of the main path, rather than a pleasant discursion, and my numbering was off. All fixed now.


rydra_wong: Lee Miller photo showing two women wearing metal fire masks in England during WWII. (Default)
rydra_wong ([personal profile] rydra_wong) wrote2025-10-20 09:37 am

I now have basically no internet or phone reception at home

Except very occasionally if I can locate a spot that currently has reception.

So while that's going on, replies to anything may be delayed, but I'm reading when I can and distractions are still very much appreciated.

ETA: may now be fixed, I am deep in spoon debt and would like to be allowed to falldowngoboomnow.
yuuago: (B5 - Londo - Working)
yuuago ([personal profile] yuuago) wrote2025-10-19 10:33 pm
Entry tags:

(no subject)

I found myself thinking that I really miss NaNoWriMo. It's not so much a Thing any more for various reasons; a lot of people have tried to cook up replacements, but of course it's not quite the same.

There used to be in-person write-ins here in Fort Mac, and they were pretty fun while they lasted. Bummer that they aren't a thing any more. (Not that I'm going to put the effort into starting that up again.)

:Va I was kind of thinking about making a Writing Goal for November, but I think it might be more achievable to aim for finishing at least one thing, rather than a specific wordcount. We'll see!
ysabetwordsmith: Damask smiling over their shoulder (polychrome)
ysabetwordsmith ([personal profile] ysabetwordsmith) wrote2025-10-19 11:19 pm

Poem: "The First Swath Cut by the Scythe"

This poem was written outside the regular prompt calls. It fills the "scythe" square in my 8-1-25 card for the Discworld Bingo fest. This poem has been sponsored by a pool with [personal profile] fuzzyred and [personal profile] mama_kestrel. It belongs to the Shiv thread of the Polychrome Heroics series.

Read more... )