Thursday, October 21, 2021

Miracle of Stephen Crane

He popped out of nothing, fully formed almost, and then wrote Red Badge of Courage, in which he truly imagined the Civil War in a way that resonated with its soldiers and victims.  For which we are thankful.

He has informed his successors, including Hemingway.  Yet he lived only thirty years or so and was struck down by tuberculosis.

We continue to forget how precarious life was until the breakthroughs of the first decades of the Twentieth century.  Then it took decades to get shit separated from our water supply.  That remains a work in progress even today in the poorer spots and sustains a market for bottled water.

The Miracle of Stephen Crane

Born after the Civil War, he turned himself into its most powerful witness—and modernized the American novel.

By Adam Gopnik, October 18, 2021

The battles in Crane’s “The Red Badge of Courage” feel like surrealist nightmares in which no one is master of his fate. Illustration by John Gall; source photographs courtesy Library of Congress

Paul Auster’s “Burning Boy: The Life and Work of Stephen Crane” (Holt) is a labor of love of a kind rare in contemporary letters. A detailed, nearly eight-hundred-page account of the brief life of the author of “The Red Badge of Courage” and “The Blue Hotel,” augmented by readings of his work, and a compendium of contemporary reactions to it, it seems motivated purely by a devotion to Crane’s writing. Usually, when a well-established writer turns to an earlier, overlooked exemplar, there is an element of self-approval through implied genealogy: “Meet the parents!” is what the writer is really saying. And so we get John Updike on William Dean Howells, extolling the virtues of charm and middle-range realism, or Gore Vidal on H. L. Mencken, praising an inheritance of bile alleviated by humor. Indeed, Crane got this kind of homage in a brief critical life from the poet John Berryman in the nineteen-fifties, a heavily Freudian interpretation in which Berryman was obviously identifying a precedent for his own cryptic American poetic vernacular in Crane’s verse collections “The Black Riders” and “War Is Kind.”

But Auster, voluminous in output and long-breathed in his sentences, would seem to have little in common with the terse, hard-bitten Crane. A postmodern luxuriance of reference and a plurality of literary manners is central to Auster’s own writing; in this book, the opening pages alone offer a list of some seventy-five inventions of Crane’s time. The quotations from Crane’s harsh, haiku-like poems spit out from Auster’s gently loquacious pages in unmissable disjunction. No, Auster plainly loves Crane—and wants the reader to—for Crane’s own far-from-sweet sake.

And Auster is right: Crane counts. Everything that appeared innovative in writing which came out a generation later is present in his “Maggie: A Girl of the Streets” (1893) and “The Red Badge of Courage” (1895). The tone of taciturn minimalism that Hemingway seemed to discover only after the Great War—with its roots in newspaper reporting, its deliberate amputation of overt editorializing, its belief that sensual detail is itself sufficient to make all the moral points worth making—is fully achieved in Crane’s work. So is the embrace of an unembarrassed sexual realism in “Maggie,” which preceded Dreiser’s “Sister Carrie” by almost a decade.

How did he get to be so good so young? Crane was born in Newark in 1871, the fourteenth child of a Methodist minister and his politically minded, temperance-crusader wife. Early in the book, Auster provides, alongside those inventions, a roll call of American sins from the period of Crane’s youth: Wounded Knee, the demise of Reconstruction, and so on—all of which, however grievous, happened far from the Crane habitat. The book comes fully to life when it evokes the fabric of the Crane family in New Jersey. The family was intimately entangled in the great and liberating crusade for women’s suffrage, which was also tied to the notably misguided crusade for prohibition. Crane lived in a world of brutal poverty—and also one of expanded cultural possibilities that made possible his avant-garde practice, and his moral realism.

By the age of twenty, Crane was a reporter. This role explains much of the way he wrote and what he wrote. He began writing for a news bureau in Asbury Park, which was already a beach resort of the middle classes, and he immediately sprang onto the page sounding like himself. The tone of eighteen-nineties newspapering—stinging, light, a little insolent, with editorial ponderings left to the editorial page—was very much his, as was the piling on of detail, the gift for unforced scene painting, the comically memorable final image pulling an episode together. From an early dispatch:

All sorts and conditions of men are to be seen on the board walk. There is the sharp, keen-looking New-York business man, the long and lank Jersey farmer, the dark-skinned sons of India, the self-possessed Chinaman, the black-haired Southerner and the man with the big hat from “the wild and wooly plains” of the West. . . . The stock brokers gather in little groups on the broad plaza and discuss the prospective rise and fall of stocks; the pretty girl, resplendent in her finest gown, walks up and down within a few feet of the surging billows and chatters away with the college youth. . . . [They] chew gum together in time to the beating of the waves upon the sandy beach.

The passage from reporter to novelist (and poet) was in some ways the dominant trajectory of American writing then, when there was no Iowa Writers’ Workshop or much in the way of publisher’s advances. You wrote for a paper and hoped to sell a book. The newspaper’s disdain for fancy talk or empty platitudes was every bit as effective in paring down your prose and making you care most about the elemental particulars as any course in Flaubert. It was from this background that Crane wrote “Maggie: A Girl of the Streets.” It is not as good a novel as we’d like it to be, given its prescience in American literature for austere realism. The story of a decent girl forced into prostitution by poverty, it is striking for the complete absence of sentimentality about either the protagonist or her circumstances: Maggie has a heart not so much of gold as of iron. What is most memorable in the book now is the talk, and Crane’s way of placing pungent, broken dialogue against a serenely descriptive background. No publisher would touch it, and, when Crane self-published it, hardly any readers would, either.

Fortunately, there was a significant exception to the wave of indifference: William Dean Howells, the good guy of American letters in his day, whose nearly infallible tuning fork for writing, which had allowed him to appreciate Emily Dickinson before almost anyone else did, also enabled him to respond to Crane. (Though only after Crane had given him a second nudge to read it. Eminent literary people want to read the work of the young and coming; they just need to be reminded that it was sent to them two months ago.)

Overnight, Crane had as a literary mentor someone who was both broadly acceptable to middlebrow readers and acutely attuned to the avant-garde. Howells was as much protector as mentor, though it seems entirely plausible that, as one critic maintained, he was the first to read Dickinson’s poetry to Crane. The exposure helped liberate his own poems. If “Maggie” is amazing, in its way, it doesn’t touch the poetry in the collection “The Black Riders,” from 1895, which reads like a collaboration between Dickinson and a streetwalker—grim materials with ecstatic measures. As Berryman saw, it is hair-raising in the modernity of its diction and the death’s-head grin of its attitudes:

I saw a creature, naked, bestial,
Who, squatting upon the ground,
Held his heart in his hands,
And ate of it.
I said, “Is it good, friend?”
“It is bitter—bitter,” he answered;
“But I like it
“Because it is bitter,
“And because it is my heart.”

Certainly nothing in “Maggie” suggested the scale of what Crane pulled off in “Red Badge” only two years later. It’s the story of a teen-age boy, of his immersion and panic in battle, during the Civil War, and of his achievement of the “red badge”—a wound, though thankfully not a fatal one. “Red Badge” is one of the great American acts of originality; and if Auster is right that it has largely vanished from the high-school curriculum, its exile is hard to explain, given that it crosses no pieties, offends no taboos, and steps on no obviously inflamed corn. It is relentlessly apolitical, in a way that, as many critics have remarked, removes the reasons for the war from the war. It’s a work of sheer pointillist sensuality and violence: no causes, no purposes, no justifications—just a stream of consciousness of fear and, in the end, deliverance through a kind of courage that is indistinguishable from insanity.


But that’s what gives it credibility as a work of human imagination: teen-age boys set down in a universe of limitless boredom suddenly interrupted by hideous violence and omnipresent death would not, in truth, think of the cause but of their own survival, seeking only the implicit approval of their fellow-soldiers. “Red Badge” is not about war; it is about battle. Soldiers fight and die so they don’t let down the other men who are in the line with them. One of the miracles of American fiction is that Crane somehow imagined all this, and then faithfully reported his imagination as though it had happened. What’s astonishing is not simply that he could imagine battle but that he could so keenly imagine the details of exhaustion, tedium, and routines entirely unknown to him:

The men had begun to count the miles upon their fingers, and they grew tired. “Sore feet an’ damned short rations, that’s all,” said the loud soldier. There was perspiration and grumblings. After a time they began to shed their knapsacks. Some tossed them unconcernedly down; others hid them carefully, asserting their plans to return for them at some convenient time. Men extricated themselves from thick shirts. Presently few carried anything but their necessary clothing, blankets, haversacks, canteens, and arms and ammunition. “You can now eat and shoot,” said the tall soldier to the youth. “That’s all you want to do.”

There was a sudden change from the ponderous infantry of theory to the light and speedy infantry of practice. The regiment, relieved of a burden, received a new impetus. But there was much loss of valuable knapsacks, and, on the whole, very good shirts. . . . Presently the army again sat down to think. The odor of the peaceful pines was in the men’s nostrils. The sound of monotonous axe blows rang through the forest, and the insects, nodding upon their perches, crooned like old women.

It was Crane, more than any other novelist, who invented the American stoical sound. Edmund Wilson, in “Patriotic Gore” (1962), saw this new tone, with its impassive gestures and tight-lipped, laconic ambiguities, as a broader effect of the Civil War on American literature. The only answer to the nihilism of war is a neutrality of diction, with rage vibrating just underneath. Hemingway wrote of the Great War, in “A Farewell to Arms,” almost in homage to what Crane had written of the Civil War.

How did Crane conjure it all? Auster dutifully pulls out the memoirs and historical sources that Crane had likely read. But the novel really seems to have been a case of a first-class imagination going to work on what had become all-pervasive material. The Civil War and its warriors were everywhere; when Crane went to Cuba to cover the Spanish-American War, in 1898, many of the leaders of the American troops were Civil War officers, including some Confederates.

Auster is often sharp-eyed and revealing about the details of Crane’s writing, as when he points out how much Crane’s tone of serene omniscience depends on the passive construction of his sentences. But when he implies that Crane is original because he summons up interior experience in the guise of exterior experience—makes a psychology by inspecting a perceptual field—he is a little wide of the mark. This is, after all, simply a description of what good writing does: Homer and Virgil writing on war were doing it, too. (We are inside Odysseus’ head, then out on the Trojan plain. We visit motive, then get blood.) What makes Crane remarkable is not that he rendered things felt as things seen but that he could report with such meticulous attention on things that were felt and seen only in his imagination. Again and again in his novel, the writing has the eerie, hyperintense credibility of remembered trauma—not just of something known but of something that, in its mundane horror, the narrator finds impossible to forget:

The men dropped here and there like bundles. The captain of the youth’s company had been killed in an early part of the action. His body lay stretched out in the position of a tired man resting, but upon his face there was an astonished and sorrowful look, as if he thought some friend had done him an ill turn. The babbling man was grazed by a shot that made the blood stream widely down his face. He clapped both hands to his head. “Oh!” he said, and ran. Another grunted suddenly as if he had been struck by a club in the stomach. He sat down and gazed ruefully. In his eyes there was mute, indefinite reproach. Farther up the line, a man, standing behind a tree, had had his knee joint splintered by a ball. Immediately he had dropped his rifle and gripped the tree with both arms. And there he remained, clinging desperately and crying for assistance that he might withdraw his hold upon the tree.

The wounded man clinging desperately to the tree has the awkward, anti-dramatic quality of something known. “Red Badge” has this post-traumatic intensity throughout, but so do later stories, just as fictive, like “The Blue Hotel” and the unforgettable “The Five White Mice,” about a night of gambling in Mexico that almost turns to murder, where the sudden possibility of death hangs in the air, and on the page, in a way that isn’t just vivid but tangible. The ability not simply to imagine but to animate imagination is as rare a gift as the composer’s gift of melody, and, like that gift, it shows up early or it doesn’t show up at all. Among American writers, perhaps only Salinger had the same precocity, the same hard-edged clarity of apprehension, and “The Catcher in the Rye,” another instantly famous novel about an adolescent imagination, shares Crane’s uncanny vividness. Rereading both, one is shocked by how small all the descriptive touches are; those ducks on the Central Park Pond are merely mentioned, not seen. Crane achieves this effect when he juxtaposes the nervous vernacular of a know-it-all soldier against his calm pastoral prose:

Many of the men engaged in a spirited debate. One outlined in a peculiarly lucid manner all the plans of the commanding general. He was opposed by men who advocated that there were other plans of campaign. They clamored at each other, numbers making futile bids for the popular attention. Meanwhile, the soldier who had fetched the rumor bustled about with much importance. He was continually assailed by questions.

“What’s up, Jim?”

“Th’ army’s goin’ t’ move.”

“Ah, what yeh talkin’ about? How yeh know it is?”

“Well, yeh kin b’lieve me er not, jest as yeh like. I don’t care a hang.”

There was much food for thought in the manner in which he replied. He came near to convincing them by disdaining to produce proofs. They grew much excited over it.

The impulse of Crane’s fiction is strictly realist and reportorial: the battle scenes in “Red Badge” feel like nightmares out of a surrealist imagination, with an excision of explanation and a simultaneity of effects, because that is what battles must be like. The result is almost mythological in feeling, and mythological in the strict Greek sense that everything seems foreordained, with no one ever master of his fate. We live and die by chance and fortune. This symbolic, myth-seeking quality of Crane’s writing gives it an immediacy that makes other American realists, of Dreiser’s grimmer, patient kind, seem merely dusty.

Auster calls Crane’s work “cinematic,” though perhaps it is closer to the truth to say that feature films were derivatively novelistic, Crane and Dickens providing the best model at hand for vivid storytelling. John Huston’s 1951 production of “Red Badge”—itself the subject of a masterpiece of reporting, Lillian Ross’s “Picture,” in this magazine—is both a good movie and faithful to the text, perhaps a good movie because it is faithful to the text. Intelligently cast with young veterans of the war just ended, including the Medal of Honor winner Audie Murphy, it evokes exactly the trembling confusion of non-heroic adolescents thrown into a slaughterhouse which Crane sought in his prose.

Crane’s ascension to celebrity was immediate. Auster produces some hostile notices—every writer has one place that just hates him, and Crane’s was the New York Tribune—but they are more than balanced by the effusive ones. (What damages writers is a completely hostile or uncomprehending press, like the reception that Melville got for “Pierre” and that helped clam him up.) Talked of and written up, Crane found that everyone wanted to be his employer or his friend, including William Randolph Hearst, who was just starting his reign at the New York Journal, and Teddy Roosevelt, then the commissioner of the New York City Police. There was even a testimonial dinner held for him in Buffalo, late in 1895, where everyone got drunk.

Then it all went wrong. Crane must have hoped that “Maggie” would be seen as a work of detached research, but he did patronize women “of the streets.” He didn’t patronize them in the other sense—he treated them as women marginalized by society, who nonetheless had the opportunity for a range of sexual experience, and with it a limited sort of emancipation, that respectable women were unhappily denied. He lived with one, Amy, who was less a sex worker than a woman who worked out her sexual decisions for herself, having a lively series of attachments to men other than Crane, even as she loved him. It was an arrangement that worked until it didn’t.

One night in 1896, Crane was out reporting on nightlife in the Tenderloin—then the red-light district, in the West Thirties—in the company of two “chorus girls.” They were joined by a woman known as Dora Clark, who had previously been arrested for soliciting, and, while Crane was putting one of the chorus girls onto a trolley, a corrupt cop named Charles Becker arrested the other chorus girl, along with Dora Clark, for propositioning two passing men. Crane intervened on behalf of both women, insisting to Becker that he was the husband of the chorus girl. (“If it was necessary to avow a marriage to save a girl who is not a prostitute from being arrested as a prostitute, it must be done, though the man suffer eternally,” he explained later.) The next morning, in police court, he intervened on behalf of Dora Clark as well. “If I ever had a conviction in my life, I am convinced that she did not solicit those two men,” he later wrote.

At first, Crane was admired for his gallantry. “stephen crane as brave as his hero. showed the ‘badge of courage’ in a new york police court. boldly avowed he had been the escort of a tenderloin woman” was the headline in Hearst’s New York Journal. Then Becker was brought up on charges, and he brutally beat Dora Clark in retaliation. In the course of a hearing, Becker’s lawyer revealed that Crane had had a long-term, live-in affair with another “Tenderloin woman,” called Amy Huntington or Amy Leslie. To top it off, the police had raided his apartment and found an opium pipe. Crane had earlier done a remarkably fine job on a piece about opium smoking, though Auster is unsure whether Crane smoked the stuff. The vivid evocation of an opium high suggests that he did, but then he excelled at the vivid evocation of things that hadn’t happened to him. Either way, he did hang the opium pipe on the wall of his apartment, a trophy of his adventures.

The headlines altered overnight, as they will. “janitor confessed that the novelist lived with a tenderloin girl an opium smoking episode” was the headline in Pulitzer’s gleeful New York World. The brave defender of embattled womanhood, not to mention the bright hope of American literature, suddenly became the guy who kept a fast woman in a Chelsea residence and smoked dope. Teddy Roosevelt broke with him, and years afterward referred to him as a “man of bad character.”

The incident set the tone for much of Crane’s subsequent life: he did things that might have seemed crazily provocative with a certain kind of innocence, not expecting the world to punish him for the provocation. It is a character type not unknown among writers—the troublemaker who doesn’t know that he’s making trouble until the trouble arrives, who then wonders where all the trouble came from. Crane seems, on the surface, to have maintained his composure in the face of the scandal. In a letter to one of his brothers, he wrote, “You must always remember that your brother acted like a man of honor and a gentleman and you need not fear to hold your head up to anybody and defend his name.” But, as he noted elsewhere, “there is such a thing as a moral obligation arriving inopportunely.” Auster thinks the affair shook him badly, and doubtless it did. To further complicate things, Amy Leslie—whom Crane genuinely seems to have loved, addressing her as “My Blessed Girl” and “My own Sweetheart,” in one tender love letter after another—sued him for stealing five hundred and fifty dollars from her. (Auster supposes that much of this was money that Crane had received as royalties—it was a lot of money, and makes sense as a check from a publisher for a hit book—and promised, and then failed, to give to her.)

To add a note of grotesque comedy, which Auster addresses in an exquisitely intricate footnote, this Amy Leslie was easily confused with a more literary friend of Crane’s, also named Amy Leslie; for generations, Crane students were convinced that they were one and the same. The literary Amy, to the end of her life, was left strenuously protesting that she hadn’t been involved in the Tenderloin affair, to the smug skepticism of Crane scholars. “You can’t fight fate,” Crane’s implicit motto, ended up ensnaring her as well.

And not her alone. Auster, who is very good at picking out superb stuff from Crane’s mostly submerged journalism, includes a shiveringly cool account of the electric chair at Sing Sing, with a tour of the graveyard below, where the executed bodies were buried. “It is patient—patient as time,” Crane writes of the newly enthroned electric chair:

Even should its next stained and sallow prince be now a baby, playing with alphabet blocks near his mother’s feet, this chair will wait. It is unknown to his eyes as are the shadows of trees at night, and yet it towers over him, monstrous, implacable, infernal, his fate—this patient, comfortable chair.

Fate having its way, Crane’s nemesis, Charles Becker, was executed in that chair two decades after his run-in with Crane, for helping to arrange the murder of a gambler. He is still the only New York City policeman ever to be put to death.

The New York scandal helped propel Crane out of the city. He began a long period of wandering, most of it with his new and devoted common-law wife, Cora—a business-minded woman who once established what may have been a brothel, in Florida. Crane’s journey took several strange turns that commentators have found darkly exemplary of the plight of the American writer. He went to Greece, in 1897, to report on the Hellenic battles with the Turks, and then to Cuba, to cover the Spanish-American War, which his previous employer, Hearst, had helped start, and his current employer, Pulitzer, wanted readers to enjoy. The fame he had earned so young kept him busy with journalistic and newspaper jobs. As a writer who had shown an unprecedented mastery of writing about a war that he had never seen, he kept getting jobs reporting on wars that he could see, and ended up writing about them much less well.

His final years were largely spent in a leased country house in England, where, as the author of “Red Badge,” he was more celebrated by the British literary establishment than he had been by the American one, but still unable to make a steady living by his pen. Conrad became an intimate, and James referred to him as “that genius,” but it was H. G. Wells who most succinctly defined Crane’s contribution as a writer: “the expression in literary art of certain enormous repudiations.”

Crane never stopped writing, pursuing both journalism, with spasmodically interesting results, and poetry, in bursts of demonic energy. His second volume of poems, “War Is Kind,” is as good as his first and, again, eerily prescient. Crane learned in reporting what another generation of poets would learn only in the Great War:

Swift blazing flag of the regiment,
Eagle with crest of red and gold,
These men were born to drill and die.
Point for them the virtue of slaughter,
Make plain to them the excellence of killing
And a field where a thousand corpses lie.

Crane’s last months have always confounded scholars. In a way, they are as piteous as Keats’s last stay in Rome, with poor Crane dying of tuberculosis at a time when no one could cure it. He coughs up blood all over Auster’s final fifty pages. Yet he kept up what has always seemed to his admirers a heavy tread of partying, with amateur theatricals and New Year’s assemblages.

A. J. Liebling, in an acidic and entertaining commentary on Crane’s final days, published six decades ago in this magazine, insisted that he died, “unwillingly, of the cause most common among American middle-class males—anxiety about money.” Liebling put together the incompetence of turn-of-the-century doctors with the brutality of turn-of-the-century publishers, two of his favorite hobby-horses, and acquitted Crane of the self-destructive behavior often attributed to him.

Crane was as famous as any young writer has ever been, but it didn’t make him rich. The jobs he could get, like writing for Hearst and Pulitzer, paid well but depended on his being out there, writing. No one lived on advances. The one moneymaking scheme that Crane pursued was the one in which a writer, having written a popular thing, is asked to write something else that bears a catty-cornered relation to it. So Crane, the author of a great novel about war, accepted a lucrative commission to write a magazine series called “Great Battles of the World”—a task for which he, hardly a historian, was ill-equipped.

There is something heroic in the desperate gaiety with which Crane and Cora insisted on living well until the end. Though Crane confided to his agent in America that he was “still fuzzy with money troubles,” Auster tells us that in England “not even their closest friends had any inkling of how hard up they were, and by spending more and more money they did not have, the couple affected a magnificent pose of nonchalance and well-being.” Then, long through the night, Crane would “lock himself in his small study over the porch,” sliding finished work under his door, for Cora to type a clean copy.

Really, the bacillus was to blame. Had Crane been healthy, he would have found a way to live and write. The famous sanatoriums of the era—Crane ended his life at one in Germany—had, at least, the virtue of sealing patients off from others, but the cruelty of the disease was that there was nothing to be done. Despite our own recent immersion in plague, we still have a hard time understanding how much the certain fatality of illness affected our immediate ancestors; Hemingway suffered in the war, but it was the Spanish influenza that made him acutely aware that death and suffering could not be turned off when wars ended.

There’s no fighting fate. The extreme stoicism of Crane’s vision, even without the resigned epicurean sensuality that lit up Hemingway’s, is what made it resonate for the “existential” generation, including Berryman. Most good writers try out many roles, put on many masks, adopt many voices, and leave it to biographers to point to the gaps between their act and their acts. A few make a fetish of not putting anything on. Crane was of that school, and, as much as he sits within the mainstream of writing, he is also among those American writers—Hunter Thompson and Ken Kesey come to mind—who deliberately sit outside it, going their own shocking way and sticking their tongue out at the pieties. (It may not be an accident that such writers tend to strike gold young and then get brassy.) Life is out to get you, and will. It’s far from the cheeriest of mottoes, but there was nothing false or showy about it. “To keep close to my honesty is my supreme ambition,” Crane wrote. “There is a sublime egotism in talking of honesty. I, however, do not say that I am honest. I merely say that I am as nearly honest as weak mental machinery will allow. This aim in life struck me as being the only thing worth while. A man is sure to fail at it, but there is something in the failure.”

Both the defiance and the defeatism are integral to Crane. He emerges from this book, as from his own, as the least phony great American writer who ever lived. Although he died with his talent only partly harvested, he left this life curiously unembittered, surprisingly serene. “I leave here gentle” were among his last words to Cora. He had eaten his own heart. ♦

Are covid “vaccines” giving people AIDS?

This will make you cry.  If this is true, we will see 2,000,000,000 die over the next five years.  The vaccine was fully weaponized to induce AIDS in its victims.

That is also the reason that it is taking time to truly appear.  This is turning out to be a massive assault on all humanity.

I still have faith that most of this will be countered, but no clear knowledge beyond the nature of this weaponization, which I clued into a long time ago.

The other side told me that we will see 2,000,000,000 dead.  Then I could not see how.  Now we know how, and it is horrible.  We now have it all.

Are covid “vaccines” giving people AIDS? Immune system functions are dropping around 5% EACH WEEK in those who were vaccinated

Sunday, October 17, 2021 by: Ethan Huff

(Natural News) The latest data from the United Kingdom’s PHE Vaccine Surveillance Report suggests that people who have been “fully vaccinated” for the Wuhan coronavirus (Covid-19) are losing about five percent of their immune systems per week.

Doubly injected people between the ages of 40 and 70 have already lost about 40 percent of the immune system capacity from the moment they get injected. They then progressively lose more of it over time, with peak immune system loss for many expected to arrive by Christmas.

“If this continues then 30-50 year-olds will have 100% immune system degradation, zero viral defence by Christmas and all doubly vaccinated people over 30 will have lost their immune systems by March next year,” reports The Exposé.

There is no denying, based on the data, that fully vaccinated people now suffer from what appears to be acquired immunodeficiency syndrome, more popularly known as AIDS. Their immune systems are fading away, which many have been warning would be the case.

“People aged 40-69 have already lost 40% of their immune system capability and are losing it progressively at 3.3% to 6.4% per week,” The Exposé says.

Interestingly, the worst-off demographic is people aged 40-49, who are suffering total immune system loss in about nine weeks. The best-off group is younger people aged 18-29, who tend to last around 44 weeks.

Elderly people over the age of 80 last about 20 weeks, while the 50-59 age category only gets about 15 weeks. The other remaining age groups last anywhere from 12 to 25 weeks.

“Everybody over 30 will have lost 100% of their entire immune capability (for viruses and certain cancers) within 6 months,” warns The Exposé.

“30-50 year-olds will have lost it by Christmas. These people will then effectively have full blown acquired immunodeficiency syndrome and destroy the NHS (National Health Service).”

Getting a “booster” shot will only speed up the death process

It is not just that the jabs do not provide the claimed amount of protection against the Fauci Flu. The fact of the matter is that they provide no protection at all in the long term and actually destroy a person’s immune system.

“Pfizer originally claimed a 95% efficiency for their vaccine (calculated as in the last column above). The figures above indicate that their figures may well have been correct immediately after vaccination (the younger age groups have had the vaccine for the shortest time),” The Exposé explains.

“But the figures above also show that the vaccines do NOT merely lose efficiency over time down to zero efficiency, they progressively damage the immune system until a negative efficiency is realised. They presently leave anybody over 30 in a worse position than they were before vaccination.”

People who take the Biden “Booster” shots will only accelerate this process by adding even more immune-destroying chemicals to their bodies. The downward spiral will move even faster, in other words, the more shots a person gets.

“If we do nothing about this, it will only get much worse than we ever could have imagined,” wrote one Exposé commenter. “I, for one, appreciate all who have stood against and continue to stand against this tyranny.”

“The spike protein hijacks your mitochondria … forever,” wrote another. “Mitochondria is the heart of your immune system. Essentially, it takes over your immune system. The spike proteins are the bioweapons. Your own immune system becomes a deadly weapon. This is HIV on steroids.”

Others echoed these same sentiments, noting that it is painfully obvious what the agenda is. How anyone could argue with a straight face that this is all for “public health” remains a mystery.

The latest news about the Wuhan coronavirus (Covid-19) “vaccine” genocide can be found at

Unhappy with prices, ranchers look to build own meat plants

Long overdue, of course.  This should see the rise of co-op slaughterhouses as well.  What is important, though, is that farmers need to own walk-in coolers able to store their hung sides and packed produce as well.  This happens to be a significant time window that commercial operations struggle with, and it is better capitalized by the producers, who have alternative storage streams.

Once the farmer has the cooler, it is no trick to call in a meat cutter to process a side for sale at retail outlets, including farmers markets.  This accesses a much higher gross and establishes a close relationship with consumers.

And why would a major retail outlet eschew the equivalent product provided by local producers, even labeled by name?

The whole bulk-production meme is flawed, and we all know it and we all want a better solution.  This gives the producer a much better margin and a better gross as well.


Unhappy with prices, ranchers look to build own meat plants

By SCOTT McFETRIDGE | October 16, 2021

DES MOINES, Iowa (AP) — Like other ranchers across the country, Rusty Kemp for years grumbled about rock-bottom prices paid for the cattle he raised in central Nebraska, even as the cost of beef at grocery stores kept climbing.

He and his neighbors blamed it on consolidation in the beef industry stretching back to the 1970s that resulted in four companies slaughtering over 80% of the nation’s cattle, giving the processors more power to set prices while ranchers struggled to make a living. Federal data show that for every dollar spent on food, the share that went to ranchers and farmers dropped from 35 cents in the 1970s to 14 cents recently.

It led Kemp to launch an audacious plan: Raise more than $300 million from ranchers to build a plant themselves, putting their future in their own hands.

“We’ve been complaining about it for 30 years,” Kemp said. “It’s probably time somebody does something about it.”

Crews will start work this fall building the Sustainable Beef plant on nearly 400 acres near North Platte, Nebraska, and other groups are making similar surprising moves in Iowa, Idaho and Wisconsin. The enterprises will test whether it’s really possible to compete financially against an industry trend that has swept through American agriculture and that played a role in meat shortages during the coronavirus pandemic.



The move is well timed, as the U.S. Department of Agriculture is now taking a number of steps to encourage a more diverse supply in the beef industry.

Still, it’s hard to overstate the challenge, going up against huge, well-financed competitors that run highly efficient plants and can sell beef at prices that smaller operators will struggle to match.

The question is whether smaller plants can pay ranchers more and still make a profit themselves. An average 1,370-pound steer is worth about $1,630, but that value must be divided between the slaughterhouse, feed lot and the rancher, who typically bears the largest expense of raising the animal for more than a year.

David Briggs, the CEO of Sustainable Beef, acknowledged the difficulty but said his company’s investors remain confident.

“Cattle people are risk takers and they’re ready to take a risk,” Briggs said.

Consolidation of meatpacking started in the mid-1970s, with buyouts of smaller companies, mergers and a shift to much larger plants. Census data cited by the USDA shows that the number of livestock slaughter plants declined from 2,590 in 1977 to 1,387 in 1992. And big processors gradually dominated, going from handling only 12% of cattle in 1977 to 65% by 1997.

Currently four companies — Cargill, JBS, Tyson Foods and National Beef Packing — control over 80% of the U.S. beef market thanks to cattle slaughtered at 24 plants. That concentration became problematic when the coronavirus infected workers, slowing and even closing some of the massive plants, and a cyberattack last summer briefly forced a shutdown of JBS plants until the company paid an $11 million ransom.

The Biden administration has largely blamed declining competition for a 14% increase in beef prices from December 2020 to August. Since 2016, the wholesale value of beef and profits to the largest processors has steadily increased while prices paid to ranchers have barely budged.

The backers of the planned new plants have no intention of replacing the giant slaughterhouses, such as a JBS plant in Grand Island, Nebraska, that processes about 6,000 cattle daily — four times what the proposed North Platte plant would handle.

However, they say they will have important advantages, including more modern equipment and, they hope, less employee turnover thanks to slightly higher pay of more than $50,000 annually plus benefits along with more favorable work schedules. The new Midwest plants are also counting on closer relationships with ranchers, encouraging them to invest in the plants, to share in the profits.

The companies would market their beef both domestically and internationally as being of higher quality than meat processed at larger plants.

Chad Tentinger, who is leading efforts to build a Cattlemen’s Heritage plant near Council Bluffs, Iowa, said he thinks smaller plants were profitable even back to the 1970s but that owners shifted to bigger plants in hopes of increasing profits.

Now, he said, “We want to revolutionize the plant and make it an attractive place to work.”

Besides paying ranchers more and providing dividends to those who own shares, the hope is that their success will spur more plants to open and that the new competitors will bring more openness to cattle markets.

Derrell Peel, an agricultural economist at Oklahoma State University, said he hopes they’re right, but noted that research shows even a 30% reduction in a plant’s size will make it far less efficient, meaning higher costs to slaughter each animal.

Unless smaller plants can keep expenses down, they will need to find customers who will pay more for their beef, or manage with a lower profit margin than the big companies.

“We have these very large plants because they’re extremely efficient,” Peel said.

According to the North American Meat Institute, a trade group that includes large and mid-size plants, the biggest challenge will be the shortage of workers in the industry.

It’s unfair to blame the big companies and consolidation for the industry’s problems, said Tyson Fresh Meats group president Shane Miller.

“Many processors, including Tyson, are not able to run their facilities at capacity in spite of ample cattle supply,” Miller told a U.S. Senate committee in July. “This is not by choice: Despite our average wage and benefits of $22 per hour, there are simply not enough workers to fill our plants.”

The proposed new plants come as the USDA is trying to expand the supply chain. The agency has dedicated $650 million toward funding mid-size and small meat and poultry plants and $100 million in loan guarantees for such plants. Also planned are new rules to label meat as a U.S. product to differentiate it from meat raised in other countries.

“We’re trying to support new investment and policies that are going to diversify and address that underlying problem of concentration,” said Andy Green, a USDA senior adviser for fair and competitive markets.


Colin Powell, First Black Secretary of State, Dies at 84 After Vaccination

How they get off pretending that this is not a vax kill escapes me.  He started out with a suppressed immune system.  Literally the last person who should have had the vaccine.

He had a remarkable career and wonderfully established a convincing role model for all non-white kids who delude themselves into thinking they lack all the advantages our civilization provides.

Yes, you can do it, and do it everywhere.  I grew up with my childhood reading including plenty of Horatio Alger stories.  We need to get back to all that.  Every kid needs that dream to show him a way out of his circumstances.

Colin Powell, First Black Secretary of State, Dies of Covid-19 at 84

The decorated general broke racial barriers in the U.S. military but attracted criticism for his part in paving the way for the Iraq War

Daily Correspondent

October 18, 2021 5:34 p.m.

Detail of Ronald N. Sherr's General Colin Powell, 2012, oil on canvas. National Portrait Gallery, Smithsonian Institution / Supported by a grant from the Donald W. Reynolds Foundation and by the Marc Pachter Commissioning Fund

Colin L. Powell, the American statesman and soldier whose legacy of public service was marred by his role in launching the Iraq War, died of complications from Covid-19 on Monday.

Powell’s family announced his death on Facebook, adding that the 84-year-old was fully vaccinated but contracted a breakthrough case of the virus. His immune system had been weakened by treatment for multiple myeloma, a cancer that affects the body’s plasma cells, report Robert Burns, Eric Tucker and Eileen Putman for the Associated Press (AP).

“Colin embodied the highest ideals of both warrior and diplomat,” said President Joe Biden in a White House statement that described Powell as a “dear friend” and “trusted confidant.”

Powell died on October 18, 2021, of complications from Covid-19. He was 84. National Portrait Gallery, Smithsonian Institution / Supported by a grant from the Donald W. Reynolds Foundation and by the Marc Pachter Commissioning Fund

A decorated general and persuasive diplomat, Powell was the first Black American to hold the positions of national security adviser, chairman of the Joint Chiefs of Staff and secretary of state. After the 9/11 attacks, he helped pave the way for the United States’ invasion of Iraq—a role that he came to view as a source of “lifelong regret,” writes Eric Schmitt for the New York Times.

On February 5, 2003, Powell, then serving as George W. Bush’s secretary of state, made an influential speech to the United Nations (U.N.) Security Council, drawing on embellished and misleading reports from the CIA. Despite his own reservations about the possible costs of war, Powell claimed that Iraqi dictator Saddam Hussein was harboring weapons of mass destruction and posed an imminent threat to the U.S.

“What we’re giving you are facts and conclusions based on solid intelligence,” Powell said.

In reality, many of the general’s own employees had previously flagged claims in the speech as “weak,” “not credible” or “highly questionable.” CIA employees had also failed to communicate a number of serious concerns to Powell, allowing his speech to go forward on the assumption that other U.S. leaders were intent on invading Iraq no matter what, as Robert Draper reported for the New York Times Magazine last year.

Powell’s comments nevertheless galvanized many Americans to support the invasion, which took place just six weeks later. The Iraq War lasted until 2011, and its aftershocks continue to wreak havoc on the Middle Eastern country and its people today: According to Brown University’s Costs of War project, direct violence stemming from the U.S. invasion of Iraq in 2003 has killed between 184,382 and 207,156 Iraqi civilians to date. U.S.-led violence also displaced millions of refugees and damaged systems that provide food, healthcare and drinking water, meaning that the actual death toll may surpass one million Iraqis.

Powell would later admit regret for throwing his substantial political capital behind the conflict. The U.N. speech “was by no means my first, but it was one of my most momentous failures, the one with the widest-ranging impact,” the politician wrote in his 2012 memoir, It Worked for Me.

He added, “The event will earn a prominent paragraph in my obituary.”

For some onlookers, Powell’s involvement in the Iraq War severely damaged the general’s positive reputation as a political moderate, a skilled architect of war and a leader of “unassailable credibility,” per the New York Times magazine.

Born on April 5, 1937, to Jamaican immigrant parents, Powell grew up in the South Bronx and attended City College, where he joined the Reserve Officers’ Training Corps (ROTC). He spent two tours in Vietnam during his 35-year career as a professional soldier.

The decorated veteran eventually rose to the highest echelons of the military, breaking racial barriers as the first Black man to hold numerous prestigious government titles. As chairman of the Joint Chiefs of Staff, Powell guided the U.S. invasion of Panama in 1989 and the U.S. invasion of Kuwait during the Gulf War of 1990 and 1991. He famously summed up his approach to the Gulf War as such: “Our strategy in going after this army is very simple. First, we’re going to cut it off, and then we’re going to kill it.”

Powell was known for stating that the U.S. should only engage in military intervention when it has “precise goals and clear public support,” the Washington Post reported in 2001. This philosophy came to be labeled the Powell Doctrine.

Speaking with Smithsonian Secretary Lonnie G. Bunch III in a 2016 oral history interview, Powell described himself as a “reluctant general.” He said his namesake doctrine contends that leaders should “try to solve [conflict] politically and diplomatically. But if war is necessary, if you’ve got to go to war, then man, do it and do it fast. Do it with decisive force.”

Most importantly, Powell added, “The Powell Doctrine simply says, ‘Make sure you know what you’re getting into.’”

Powell's official portrait as secretary of state Public domain via Wikimedia Commons

By the time of his retirement from the military in 1993, Powell’s gift for public speaking had made him “the most popular public figure in America,” according to the Times. He debated running for president or vice president as a Republican, and at one point was considered the “leading contender” to become the first Black U.S. president, writes Devan Cole for CNN.

Though he eventually decided against a political run, Powell would later surprise many by supporting Democrat Barack Obama in his 2008 presidential campaign.

“I think we need a generational change,” Powell said at the time.

After the September 11 attacks, Powell worked (and often disagreed) with hawkish Vice President Dick Cheney and Defense Secretary Donald Rumsfeld as the leaders shaped U.S.-led campaigns in Afghanistan and Iraq. Brown University estimates that this so-called “War on Terror,” including related violence in Pakistan and Syria, has killed more than 900,000 and displaced more than 38 million to date.

In 1997, Powell served as founding chair of America’s Promise, a nonprofit organization benefitting at-risk children across the country. He was also a founding donor and council member of the Smithsonian’s National Museum of African American History and Culture (NMAAHC), which opened its doors in 2016. That same year, the Smithsonian’s National Museum of American History awarded Powell its “Great Americans” medal in recognition of his “lifetime contributions that embody American ideals and ideas.”

Last month, the statesman helped NMAAHC celebrate its five-year anniversary.

“[Powell] was always personable and welcoming, and we remain inspired by his achievements, brilliance and dedication to the future of this country,” writes NMAAHC’s director, Kevin Young, on Twitter. “Our thoughts go out to his family and loved ones.”

Wednesday, October 20, 2021

Pre-Columbian America Wasn't Exactly a Paradise of Freedom

An excellent review of Indigenous statehood, which naturally turns out to be as sophisticated as European practice.  We have been led to ignore all this, and that is a mistake.

Tribute came in the form of corn, and that is no surprise.  More surprising, it turns out that copper was a monopoly of the aristocracy, and this has been dismissed by historians.  Certainly this was true in Europe, and during the Bronze Age in the Americas it had to be true as well.  This is confirmation that the practice never died out.

It is also obvious that copper was used to produce weapons for the leaders.  Being valuable, they rarely ever lost a piece.

Pre-Columbian America Wasn't Exactly a Paradise of Freedom

10/14/2021 | Daniella Bassi

The story of European colonization of the Americas is popularly understood as the conquest of American Indians—the end of natives’ control of the land and the beginning of their subjugation. The contingencies of indigenous agency and geopolitics mean that the reality is much messier, as historians have been steadily revealing for decades, but this interpretation still circulates.

One possible reason for its longevity is the still common impression that Indians all roamed freely over the land, lacking a conception of private property and existing in a state of virtual harmony when the first agents of European states made contact in the late fifteenth, sixteenth, and seventeenth centuries. A necessary corollary to this image of precontact native freedom is the implication that these societies had no state or barely had one to speak of and that the suffocating stays of political power were as novel to them as the diseases the strangers carried with them.

Certainly, many indigenous societies were self-governing—consensual chiefdoms in which leaders were unable to use force or to act without consulting their entire community. The chiefdoms of Hudson Valley societies such as the Mahicans (a.k.a. Mohicans) are a case in point. In these kinds of societies dissatisfied tribesmen could even desert a chief without fear of retribution. Other groups such as Inuit lacked chiefs entirely, though talented hunters’ and elders’ opinions held special weight when community members made decisions.

But it must not be forgotten that large centralized polities also existed in the Americas prior to European contact. These had the basic trappings of a state: a centralized authority’s superimposition of property claims (and accompanying authority) over the existing property rights of others through force and intimidation, and exploitative economic relations in which this self-proclaimed authority extracts wealth from others by force or intimidation rather than voluntary exchange.

For example, the Powhatan chiefdom of the Chesapeake consisted of a paramount chief (the mamanatowick), the chiefs (werowances, or “commanders”) of subject tribes under him, the werowances of satellite towns, and commoners. Unsurprisingly, the mamanatowick and the werowances (who all could be male or female) alike inherited their offices and had a symbiotic relationship with the influential clerical class, who were consulted in matters of foreign policy and crime.1

Powhatan, who was the mamanatowick in the days of Jamestown, inherited the paramount chieftaincy and six chiefdoms (Powhatan, Arrohateck, Appamattuck, Pamunkey, Mattaponi, and Chiskiack) from his parents between the 1550s and 1580s. He then expanded his rule: he conquered the Kecoughtans (he had his goons kill their chief), exterminated the Chesapeakes (had his goons massacre most of the people, who would not submit), and by 1607, when John Smith made landfall in the name of the English state, had subjugated all the peoples of the Chesapeake coastal plain except for the Chickahominies.2

Tribute payments of food and other valuables went up the hierarchy, extracted by both the mamanatowick and the werowances. The only exception was copper, which Powhatan monopolized and used to pay his werowances for their military services—that is, for them to kill others, stare down those who remained, and thereby keep the great chief in power. He also made gifts of copper to others, buying support and perhaps submission.3

The tribute payments were involuntary—there is even record of people hiding food in underground storage pits in addition to the aboveground buildings specifically designated as storehouses, possibly to keep more of their wealth. As contemporaneous observer William Strachey noted:

Their corn and (indeed) their copper, hatchetts, howses, beades, perle and most things with them of value, according to their own estymacion, they hide, one from the knowledge of another, in the grownd within the woods, and so keepe them all the yeare, or untill they have fitt use for them … and when they take them forth, they scarse make their women privie to the storehowse.4

The Narragansetts, Massachusetts, Wampanoags, and Pequots of southern New England had a similar political economy. Here power was concentrated in sachems, who also inherited authority, and local elites. There were layers of sachemships, with subordinate sachems paying tribute to the dominant sachem with the wealth created by their people. Internal tribute was also levied on communities, which enriched the sachem and allowed him to make war on other peoples to expand his dominion.5 As Plymouth colonist Edward Winslow explained:

Every sachim knoweth how far … his own country extendeth; and that is his own proper inheritance…. In this circuit whosoever hunteth, if [his men] kill any venison, bring him his fee…. Once a year the pnieses [warrior elite] use to provoke the people to bestow much corn on the sachim.6

The Nahuas of central Mexico are an even better example of people living under a pre-European state. The Nahuas consisted of a variety of Nahuatl-speaking nations among whom the Aztecs (sometimes called Mexicas) were dominant when the agents of the Spanish state marched in in 1519. This complex society in the early sixteenth century was organized into a network of kingdoms or city-states. Each kingdom (altepetl, pl. altepeme) was inhabited by a specific Nahua group, ruled by a tlatoani, and had “ranked classes of warrior-nobles, priests, commoners, and slaves.” Each altepetl was subdivided into districts and neighborhoods, called calpulli.7

The Aztec elite extracted tribute from conquered Nahuas and from their own local peasantry, and each altepetl in turn demanded tribute from altepeme under its control (if any) and from its own commoners. Tribute was collected by the officials of the local calpulli. A variety of special lands set aside for the support of the clergy and incumbent politicians, as well as for the personal benefit of nobles, were worked by slaves and by commoners under temporary forced, corvée-like labor, which was part of their tribute burden. Historian Allen Greer describes the Aztec Empire as “an engine of tribute exaction.” Much like the contemporary states of today, when the empire subsumed a new kingdom, they sometimes installed a puppet tlatoani to keep the gravy flowing smoothly.8

Each person was carefully accounted for: local officials conducted censuses for the altepeme that tracked the population of each calpulli down to the household (calli) level. Each household’s head and members (along with their age, sex, and civic status) were detailed, and their specific landholdings were surveyed, mapped, and the dimensions and surface area noted. Although each household held a specific piece of land, the family did so “under the authority and eminent domain of the local calpulli and its officials,” their property rights superseded by the claims of their state. Calpulli land could not be alienated outside the kin group and was subject to tribute for the local or Aztec government in proportion to its size—no wonder those nifty maps were in the census! At least purchased land could be sold, and it was not subject to tribute, something that cannot be said about most land purchased in the US today. Each altepetl could also handle its own internal affairs without interference from above as long as everyone forked over their “protection,” or, better, leave-me-alone money.9

The moral of the story here is that we cannot forget the polygenic character of the state in telling the story of the unending struggle between freedom and subjection across the world. Just as different ancient societies developed agriculture on their own, the institution of the state surfaced independently in different parts of the ancient world, continuing on its ruinous trajectory from there. To tell the story of the Americas as the violent “pacifying” and corralling of free indigenous peoples by white outsiders is to erase the long history of statism in many places. Sadly, statism had plagued many people for a long time when the agents of European states arrived, many with the express aim of aiding their states in continuing their reign of pillage and oppression in a new land. After all, using aggression to get ahead in life is an age-old tactic.
  • 1. Helen C. Rountree, Pocahontas’s People: The Powhatan Indians of Virginia through Four Centuries (Norman: University of Oklahoma Press, 1990), pp. 9–11.
  • 2. Rountree, Pocahontas’s People, pp. 10–11, 25–27.
  • 3. Rountree, Pocahontas’s People, pp. 8, 9.
  • 4. William Strachey, The Historie of Travaile into Virginia Britannia: Expressing the Cosmographie and Comodities of the Country, Togither with the Manners and Customes of the People, ed. R.H. Major (1612; London: Hakluyt Society, 1849), p. 113.
  • 5. Allen Greer, Property and Dispossession: Natives, Empires, and Land in Early Modern North America (Cambridge: Cambridge University Press, 2018), pp. 40–42.
  • 6. Edward Winslow, “Good Newes from New England: Or a True Relation of Things Very Remarkable at the Plantation of Plimoth in New-England” [1624], in Chronicles of the Pilgrim Fathers of the Colony of Plymouth from 1602 to 1625, ed. Alexander Young (Boston: C.C. Little and J. Brown, 1844), pp. 361–62, quoted in Greer, Property and Dispossession, p. 41 (“warrior elite” gloss by Greer).
  • 7. Greer, Property and Dispossession, pp. 30–31, quote on p. 30.
  • 8. Greer, Property and Dispossession, pp. 30–31, 33–34, quote on p. 31.
  • 9. Greer, Property and Dispossession, pp. 323, 34, 36, 31, quote on p. 34.

Inside The Brutal History Of Indigenous Residential Schools In Canada

To start with, there is far too much propaganda serving financial interests.  The residential school system was modeled after the private boarding school system in England, which was rife with imposed homosexual behavior among the boys themselves.  Think Eton.  Obviously any form of boarding school system was going to have this type of leakage.

Yet the real death rate was not exceptional in a world in which many died from pneumonia.  Again the trash writers are taking the big number covering a century and drumming it into a plan.  Children died surviving the winter and certainly died out in the hunting camps.

The hard truth is that children in distributed hunting camps needed to become educated like their white peers, who were all hustled into schools of some sort over the same time period.  This program succeeded at the expense of wrecking traditional cultures and separating children from their families.

Those illiterate citizens would have died off or been dragged into farm labor when the fur trade basically died down to below a living wage.

Today we have around 7,000,000 Native Americans in the USA, down from an estimated 60,000,000 at contact in 1492.  We have 1,700,000 in Canada, against at best 2,000,000 at contact (no agriculture).  In fact the fur trade actually subsidized the Indian population in the boreal forest and certainly preserved it.  I do think that the USA reduced populations through slavery and thus forced intermarriage.  They had two good centuries to do just that, with Spanish assistance.

This is no apologia for how awful it could be.  Yet the foster children system continues to screw up.  Same old story.  No one says a word because they lack a better solution.  Today all First Nations do speak English and are intermarrying into the larger culture, and that was the intent.  They are also working to preserve their past as well.

Inside The Brutal History Of Indigenous Residential Schools In Canada

By Kaleena Fraga | Checked By Jaclyn Anglis
Published October 11, 2021

From 1883 to 1996, nearly 150,000 Indigenous children were forcibly taken away from their parents and sent to Canadian residential schools where they faced horrific abuse.

Library and Archives Canada/Flickr: Children at an Indian Residential School in Ft. Simmons hold letters that spell out “Goodbye.”

For over a century, Canada held a dark, open secret. All across the country, officials forcibly took nearly 150,000 Indigenous children away from their parents and sent them off to abusive “residential schools.”

These schools, which operated from 1883 to 1996, banned students from speaking their native languages or practicing their cultural beliefs. Many of these students faced systemic abuse on a regular basis. Even worse, some children mysteriously vanished on the school grounds.

Thousands of kids — some estimates range from 10,000 to 50,000 — simply never came home. Though some ran away, thousands more died at the schools. Today, as their remains are slowly recovered from school grounds across the country, Indigenous leaders are demanding answers.

Those answers remain elusive. But they also represent the tragic end of a 100-year-old story — which is finally seeing the light of day.

The Creation Of Residential Schools In Canada

By the time European settlers began arriving in Canada en masse in the 16th century, Indigenous people had already lived there for thousands of years. At first, the settlers and the Indigenous people tried to coexist peacefully. In 1701, they agreed to share the territory like “a dish with two spoons.”

But the peace didn’t last. By the 19th century, settlers had begun to demand more access to land across Canada — land that belonged to the Indigenous people. And many of these settlers subscribed to the British Empire’s belief that they had a duty to “civilize” Indigenous people.

In 1857, the Gradual Civilization Act mandated that Indigenous men learn English and French. The Act also demanded that they disavow their traditional names and adopt government-approved names instead.

By 1883, Canada opted to go one step further. The government decided to use schools as a way to assimilate Indigenous children at an early age.

In order for these Indian Residential Schools to be successful, argued Sir John A. Macdonald, the first prime minister of Canada, the Indigenous children must be removed from their parents.

“When the school is on the reserve, the child lives with its parents, who are savages,” Macdonald declared in 1883, “and though he may learn to read and write, his habits and training and mode of thought are Indian.”

Indigenous children, insisted Macdonald, must be taken “from the parental influence.” He said that they should spend their childhoods in schools “where they will acquire the habits and modes of thought of White men.”

Library and Archives Canada/Flickr. Nuns with Indigenous children in Port Harrison, Quebec, circa 1890.

Before long, about 150 schools — run by Catholic, Anglican, United, and Presbyterian churches — opened across Canada in partnership with the federal government. But they sought to do more than simply educate the kids. The Indian Residential Schools in Canada aimed to eliminate the children’s Indigenous knowledge and identity altogether.

“I want to get rid of the Indian problem,” stated Duncan Campbell Scott, the former deputy minister of Indian Affairs, in 1920.

Scott continued, “I do not think, as a matter of fact, that the country ought to continuously protect a class of people who are unable to stand alone. Our objective is to continue until there is not a single Indian in Canada that has not been absorbed into the body politic.”

But Indigenous people in Canada had no choice in the matter at all. Whether they were members of the First Nations, the Inuit, or the Métis communities, government officials just showed up at their doors and took their children.

As one Inuit survivor named Piita Irniq explained: “I was forcibly removed, taken, kidnapped by a Roman Catholic priest and a government man in August of 1958 so that I could be taken, like all of my generation of Inuit, to go to a residential school. We were taken away from our parents.”

Sometimes — far too often — the children never came home.

Life For Indigenous Children At The Schools

Multiple generations of Indigenous children spent much of their childhoods at Indian Residential Schools in Canada. Those who survived the ordeal often describe a terrifying atmosphere of violence and abuse at the hands of priests, nuns, and other staff members at the schools.

“They made us believe we didn’t have souls,” recalled Florence Sparvier, who attended the Marieval Indian Residential School. Sparvier recalls suffering physical abuse as staff members attempted to scare her away from her Indigenous identity — and discourage her from practicing her culture.

“We learned,” she said. “They pounded it into us. And really, they were very mean. When I say pounding, I mean pounding.”

Others recall suffering sexual abuse at residential schools in Canada. John Jones, who attended the Alberni Residential School, remembers hearing about a male supervisor who gave out chocolate bars to students. When Jones went to get some, the man sexually abused him.

“I don’t know how long that lasted, but I know I threw the chocolate bar in the garbage,” said Jones. “I took baths three or four times a day to feel clean, and it didn’t help.”

Another survivor, Jack Kruger, who attended St. Eugene’s Mission residential school, remembers that his best friend killed himself after being sexually abused by a priest — when he was just six years old.

“When you’re a little boy, you couldn’t do nothing,” said Kruger, who spent three years at the school. “You couldn’t say nothing. The priests had so much damn power. It’s incredible.”

National Centre for Truth and Reconciliation. Indigenous children, nuns, and priests at the Kamloops Indian Residential School in 1937.

To make matters worse, Canadian authorities were aware that the schools had several problems. In 1907, Indian Affairs chief medical officer Peter Bryce visited 35 Indigenous schools in Canada and found that 25 percent of their students had died. At one of the schools, 69 percent had died.

The schools, Bryce noted in his report, were poorly constructed and had bad ventilation. Tuberculosis spread like wildfire. Meanwhile, other officials noted issues with overcrowding, faulty heating, and inadequate nutrition.

“We cried to have something good to eat before we sleep,” recalled Andrew Paul, a survivor of the Aklavik Roman Catholic Residential School. “A lot of the times the food we had was rancid, full of maggots, stink.”

Though some of the students died of diseases like tuberculosis, others simply vanished. Their parents never learned what happened to them, although sometimes Canadian authorities said they’d run away.

“Sometimes kids would not show up in classroom,” said Garry Gottfriedson, a survivor of Kamloops Indian Residential School. “They would disappear for the next day and we knew that they were gone, but we didn’t know where they were gone.”

But on a number of occasions, survivors witnessed death firsthand. Some victims were beaten so brutally that they died due to their injuries. And some survivors have testified to seeing babies — born to young female students who had been raped by priests — deliberately killed.

Despite horrific stories like these, the schools operated for more than 100 years. The last Indian Residential School in Canada didn’t close until 1996.

The Ongoing Search For Answers

More than a decade after Canada’s last residential school closed, the government finally began to reevaluate the schools’ place in Canadian history. In 2008, the Canadian government offered Indigenous people a formal apology. And in 2015, Canada’s Truth and Reconciliation Commission determined that the schools were guilty of “cultural genocide.”

The Commission report also named some 3,200 students who had died while they were at the residential schools. But many Indigenous leaders believe that the number could be much higher — possibly in the tens of thousands. And in recent years, they’ve set out to find proof for themselves.

In 2021, members of the Tk’emlúps te Secwépemc Nation swept the grounds of Kamloops Indian Residential School with ground-penetrating radar. Tragically, they found the remains of 215 children on the grounds.

And just a few weeks later, members of the Cowessess First Nation used ground-penetrating radar to uncover as many as 751 children’s bodies at the since-demolished Marieval Indian Residential School.

Cole Burston/AFP via Getty Images. A shrine for children at Kamloops Indian Residential School, where tribal members found 215 bodies in 2021.

“We had a knowing in our community that we were able to verify,” Tk’emlúps te Secwépemc Chief Rosanne Casimir said. “At this time, we have more questions than answers.”

Murray Sinclair, a member of the Peguis First Nation who led the Truth and Reconciliation Commission, agrees.

“We need to know who died, we need to know how they died, we need to know who was responsible for their deaths or for their care at the time that they died,” said Sinclair. “We need to know why the families weren’t informed. And we need to know where the children are buried.”

In the end, finding the answers to these questions remains the goal of countless Indigenous people across Canada. For 100 years, the Indian Residential Schools took their children. Now, they want to bring them home.