Heavy Dragonplate or Extra Thin Tissue Paper?

The catch-all forum for general topics and debates. Minimal moderation. Rated PG to PG-13.
Physics Guy
God
Posts: 1968
Joined: Tue Oct 27, 2020 7:40 am
Location: on the battlefield of life

Re: Heavy Dragonplate or Extra Thin Tissue Paper?

Post by Physics Guy »

There are multiple terms for soft drinks in current American English. In some southern districts the generic term for a soft drink of any kind is "a coke". One can be asked in a restaurant which kind of coke one would like to drink. One could answer "Orange Crush" or "Root Beer".

There are also multiple competing terms for those sandwiches on long, thin loaves of bread. Some places call them "submarine sandwiches" or "subs". Others call them "hero sandwiches". Still others know them as "hoagies".

Some years ago relatives of mine were on a road trip through an unfamiliar dialectal region. To the guy in the takeout window, the question "What is a hoagie?" was like asking, "What is fire?" He didn't know what to say.
I was a teenager before it was cool.
Dr Moore
Endowed Chair of Historical Innovation
Posts: 1889
Joined: Mon Oct 26, 2020 2:16 pm
Location: Cassius University

Re: Heavy Dragonplate or Extra Thin Tissue Paper?

Post by Dr Moore »

Billy Shears wrote:
Mon Sep 06, 2021 2:35 pm
Kyler assumes that the count of archaic definitions follows a Poisson distribution with a parameter of lambda equal to “at most” 1 per book.

The problems with this include:

1- He is assuming a priori that the “archaic meanings” really are archaic and with much more than a gazillion-to-one certainty couldn’t have existed in Joseph Smith’s time and place. Doing this isn’t adding mathematical rigor to Carmack’s arguments. It is assuming that Carmack is right.

2- The probability distribution is made up and the parameterization is made up. There is literally zero justification for any of this.
I would argue there is one and only one justification for a Poisson analysis making an appearance in Kyler's work on Book of Mormon archaisms. He's showing off; it's a transparent attempt to "confuse his audience" for the win.

There is an old executive leadership saying about hot-shot employees who try too hard to impress leadership with fancy analyses and needless complexity. They're "mistaking confusion for clarity."

If you're not committed to forcing the Book of Mormon to be ancient, there is a very clear and straightforward explanation for the discovery of 15th/16th century archaic English language patterns in Joseph's dictated Book of Mormon project. On that point, here's another classic boardroom proverb: "if you're still explaining it, you've already lost."
Dr Exiled
God
Posts: 2107
Joined: Tue Oct 27, 2020 2:40 pm

Re: Heavy Dragonplate or Extra Thin Tissue Paper?

Post by Dr Exiled »

Billy Shears wrote:
Mon Sep 06, 2021 2:35 pm
kyzabee wrote:
Sat Sep 04, 2021 7:40 am
Also, looks like Billy never addressed the way I handled the extinct Early Modern English semantics. I'm sure he'll get around to it, though.
Thank you for your patience, Kyler. Here is how I see it.

Background
Adventures of Huckleberry Finn begins with this explanatory note from Mark Twain:

In this book a number of dialects are used, to wit: the Missouri negro dialect; the extremest form of the backwoods Southwestern dialect; the ordinary “Pike County” dialect; and four modified varieties of this last. The shadings have not been done in a haphazard fashion, or by guesswork; but painstakingly, and with the trustworthy guidance and support of personal familiarity with these several forms of speech.

I make this explanation for the reason that without it many readers would suppose that all these characters were trying to talk alike and not succeeding.


The reason I bring this up is because Huckleberry Finn illustrates that ordinary people speak English differently than intellectuals write English, and these dialects can and do vary dramatically from location to location. Mark Twain identified no fewer than five dialects just in Pike County, Missouri. In the early 1800s, dialects were probably stronger and more varied than they are today.

Furthermore, with exceedingly few exceptions like Huckleberry Finn, we don't have records of how people actually spoke in the first half of the 1800s.

How to Approach the Semantic Issue
Apparently, Carmack has identified 26 instances of alleged “archaic word meanings.” I haven’t seen the list of the specific hypothesized word meanings. But in general, I would say the hypotheses that would explain this fall into the following broad categories:

1- The archaic definition is an example of a remnant of Early Modern English that survived into Joseph Smith’s dialect.

2- The writer really intended something different than the Early Modern English definition, but the Early Modern English definition seeming to fit is merely a coincidence that looks more impressive than it is because of Carmack’s and Skousen’s data mining.

3- Scattered within the 800 page book, Joseph dictated a couple dozen words that really were intended to have Early Modern English definitions that nobody properly understood until Carmack and Skousen did this research.

I would presume that 100% of independent linguists would think #1 and #2 are infinitely more plausible than #3. The objective of a proper analysis would be to prove that the instances we have are not a result of #1 or #2. This seems especially difficult because we don’t have a solid understanding of the Wayne County NY dialect, the Windsor County VT dialect, or whatever dialect the author of the Book of Mormon actually spoke.

A valid statistical approach requires that the Book of Mormon be compared with books dictated by people with limited education who were from Joseph Smith's area and who were trying to sound Biblical, not books that were written by intellectuals from across the English-speaking world.

Finally, a valid statistical approach would note that semantics and syntax are both driven by dialect. These issues are not statistically independent as Kyler assumes.

Kyler’s Approach
Kyler assumes that the count of archaic definitions follows a Poisson distribution with a parameter of lambda equal to “at most” 1 per book.

The problems with this include:

1- He is assuming a priori that the “archaic meanings” really are archaic and with much more than a gazillion-to-one certainty couldn’t have existed in Joseph Smith’s time and place. Doing this isn’t adding mathematical rigor to Carmack’s arguments. It is assuming that Carmack is right.

2- The probability distribution is made up and the parameterization is made up. There is literally zero justification for any of this.

To illustrate, Kyler linked the following Wikipedia definition:

In probability theory and statistics, the Poisson distribution (/ˈpwɑːsɒn/; French pronunciation: ​[pwasɔ̃]), named after French mathematician Siméon Denis Poisson, is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time or space if these events occur with a known constant mean rate and independently of the time since the last event.

How could we possibly claim that the probability of an archaic word meaning occurring over the fixed interval of "a book" has a known constant rate of one-per-book, and that the chances of this happening are independent of the time since the last apparent archaic meaning?

From the perspective of Bayesian reasoning, he's saying the following with his actual math. First, let X be a random variable that is the number of archaic definitions in the book.

Under the "it's true" hypothesis, X is distributed with a discrete distribution where X = 13 with a probability p = 1.00.

Under the "it's made up" hypothesis, X is distributed with a Poisson distribution with lambda = 1.00.

(And remember. These numbers were chosen "for the benefit of the critics, of course.")

The experiment is then run. Lo and behold! We objectively measure exactly 13 archaic definitions! What an amazing bullseye!
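(As a quick check on the quoted arithmetic, here is a minimal sketch of the Bayes factor that the assumed P = 1 manufactures; the λ = 1 and k = 13 numbers are taken from the quote:)

```python
import math

# Poisson pmf: probability of exactly k events given rate lam
def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

# The quoted setup: under "it's made up", X ~ Poisson(lambda = 1);
# under "it's true", P(X = 13) is simply assumed to be 1.
p_made_up = poisson_pmf(13, 1.0)  # astronomically small
p_true = 1.0                      # assumed, not derived from anything

bayes_factor = p_true / p_made_up
print(f"P(X=13 | Poisson(1)) = {p_made_up:.2e}")
print(f"implied Bayes factor = {bayes_factor:.2e}")
```

Roughly ten orders of magnitude "in favor," purely from the two assumed inputs.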
Thank you for your work on this. Early Modern English always bothered me because it is so obviously a reach due to motivated reasoning that I still can't believe how DCP allows it to persist.
1- The archaic definition is an example of a remnant of Early Modern English that survived into Joseph Smith’s dialect.

2- The writer really intended something different than the Early Modern English definition, but the Early Modern English definition seeming to fit is merely a coincidence that looks more impressive than it is because of Carmack’s and Skousen’s data mining.
Glaringly, we don't have any evidence of how Joseph Smith spoke and how those around him spoke. The late Clark Goble brought this up time and again on the MD&D board to Carmack, and Carmack would never engage (for good reason). There isn't any evidence for Nephites, so we must default to the Book of Mormon being Bible fan fiction. The rock-and-hat magic show reaffirms this. So why, if it is even there, is there Early Modern English in the Book of Mormon? A backwoods would-be religionist who wants to justify his scrying by dictating his Bible fan fiction, trying to sound the way he imagines an old Native American Jewish prophet would sound, seems to be the best theory to me.

Then, your pointing out this below is simply unacceptable and DCP should be ashamed at publishing this nonsense.
2- The probability distribution is made up and the parameterization is made up. There is literally zero justification for any of this.
Myth is misused by the powerful to subjugate the masses all too often.
Physics Guy
God
Posts: 1968
Joined: Tue Oct 27, 2020 7:40 am
Location: on the battlefield of life

Re: Heavy Dragonplate or Extra Thin Tissue Paper?

Post by Physics Guy »

I think I clarified the Bayesian form of the Sharpshooter Fallacy a while back, at least for me. I produce a long string of random characters, and then declare it as a foundational doctrine of my new religion that this precise string of characters is a vitally important new message from God to humanity.

The Sharpshooter Fallacy in Bayesian form then goes as follows.
1) On the hypothesis that I am not a prophet, the probability of my posting that particular string of characters is 1 in [as large a number as I want, if I just included enough random characters in the string].
2) On the hypothesis that I am the true prophet of this new religion, the probability of my posting that particular string of characters is 1, because it's a necessary implication of this hypothesis that exactly that string is the vital message from God.
3) One line of Bayesian arithmetic: Bingo! It's as overwhelmingly likely as I want to make it, by making a long enough string, that I am a true prophet. No matter how low you make the prior probability of my prophethood, I can pump that probability up to near one with a long enough string.

Let's say that the chance of that string as a random guess is 1 in 10^34, while the prior chance that I am a prophet is 1 in 10^10. So the Bayesian Sharpshooter Fallacy puts my chance of not being a prophet, given the string, at 1 in 10^24.

So where is the fallacy? When I declared in hindsight that the probability of this long random string had to be one, given that I was a true prophet, this was actually a bait-and-switch move that swept an arbitrarily enormous probability ratio under the rug.

Is the hypothesis simply that I am a true prophet with a true message from God? Very well, the prior probability of that is 1 in 10^10. But given only that rather general hypothesis, and no further detail, the probability that God's message through me should be exactly that long random string by no means has to be one. God could reveal all kinds of things. That random string was just weird. The likelihood that God would reveal that specific weird string is about as low as the probability that it would occur randomly, 1 in 10^34. So then the bottom line Bayesian chance that I am a true prophet is still 1 in 10^10.

Or is the hypothesis rather the more specific statement that I am a true prophet AND that God's message to us is that specific long string? Very well, in that case the probability of the string, given the hypothesis, is indeed one. But now the prior probability of this hypothesis cannot be the same as the prior probability of the much weaker hypothesis that I was a prophet but God's message could be anything. A reasonable prior probability for this far stronger hypothesis is 1 in 10^10 TIMES 1 in 10^34, or 1 in 10^44. Again the correct Bayesian bottom line for my prophet probability, given the data, is the same prior value of 1 in 10^10.

The fallacy consisted of using the P(data|hypothesis) that was appropriate for the strong version of the hypothesis together with the P(hypothesis) prior that was appropriate for the weak version of the hypothesis. This fallacious bait-and-switch conceals a probability factor which is exactly equal to the very low default probability of the observed data. This is exactly the trick.

The numbers I used in this example were made up for illustration, but the fact that the spurious factor which the trick introduces is as enormous as 10^34 is not an artificial exaggeration on my part. It's very easy to produce Sharpshooter Fallacy Bayesian factors that are extremely large, by just collecting a lot of random data and then using this bait-and-switch move with the hypothesis.

One can very easily even appear to be extravagantly generous to critics, conceding impressively tiny prior probabilities—that is, prior probabilities that would be conservatively low for the weak version of the hypothesis. The huge factors from large sets of data that are smuggled in by the bait-and-switch from weak to strong in the hypothesis can easily overwhelm any apparent concession in the priors.

It's a trivial and obvious trick once you see it, but like many such tricks it can be baffling until you notice the gimmick.
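For the curious, the bookkeeping works out like this in a few lines of Python (same made-up illustrative numbers as above, worked in log10 odds throughout):

```python
# Illustrative numbers from the example: a 1-in-10^34 string,
# a 1-in-10^10 prior on prophethood. Log10 odds avoid underflow.
log_p_string = -34.0    # log10 P(string | random guessing)
log_prior_odds = -10.0  # log10 prior odds of prophethood

# Fallacious version: likelihood 1 from the STRONG hypothesis,
# paired with the prior for the WEAK hypothesis.
log_posterior_odds_fallacy = log_prior_odds + (0.0 - log_p_string)  # -10 + 34

# Correct, weak hypothesis: God could reveal anything, so the specific
# string is about as unlikely given prophethood as given chance.
log_posterior_odds_weak = log_prior_odds + (log_p_string - log_p_string)

# Correct, strong hypothesis: likelihood 1, but the prior must shrink
# by the same 10^34 factor the hypothesis now builds in.
log_posterior_odds_strong = (log_prior_odds + log_p_string) + (0.0 - log_p_string)

print(log_posterior_odds_fallacy, log_posterior_odds_weak, log_posterior_odds_strong)
```

Both correct versions land back on the prior; only the mismatched pairing yields the spectacular 10^24.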
I was a teenager before it was cool.
kyzabee
Sunbeam
Posts: 51
Joined: Sat Jul 24, 2021 2:51 am

Re: Heavy Dragonplate or Extra Thin Tissue Paper?

Post by kyzabee »

Thanks for taking the time to think this through, Billy, as always. And thanks for the opportunity to respond.
Billy Shears wrote:
Mon Sep 06, 2021 2:35 pm
The reason I bring this up is because Huckleberry Finn illustrates that ordinary people speak English differently than intellectuals write English, and these dialects can and do vary dramatically from location to location. Mark Twain identified no fewer than five dialects just in Pike County, Missouri. In the early 1800s, dialects were probably stronger and more varied than they are today.


And if any of these varied dialects just happened to match the syntax and semantics of Early Modern English, you'd think we could find some evidence of that in a database as broad and deep as NCCO.
Billy Shears wrote:
Mon Sep 06, 2021 2:35 pm
Furthermore, with exceedingly few exceptions like Huckleberry Finn, we don't have records of how people actually spoke in the first half of the 1800s.


I'm afraid you'll have to demonstrate to me that the spoken syntax of the 19th century differs so markedly from anything recorded in the textual record that its matching archaic syntax is a real possibility. Because to me that sounds an awful lot like one of Huck Finn's tall tales.

And it turns out the textual record gives us far more insight into spoken language than you're purporting here:

https://www.degruyter.com/document/doi/ ... -0002/html
Billy Shears wrote:
Mon Sep 06, 2021 2:35 pm
1- The archaic definition is an example of a remnant of Early Modern English that survived into Joseph Smith’s dialect.


If it survived in Joseph's dialect, we should've expected it to survive at least somewhere else in the 19th century textual record.
Billy Shears wrote:
Mon Sep 06, 2021 2:35 pm
2- The writer really intended something different than the Early Modern English definition, but the Early Modern English definition seeming to fit is merely a coincidence that looks more impressive than it is because of Carmack’s and Skousen’s data mining.


Again, if so, these sort of coincidences should be relatively easy to find elsewhere in the textual record. Carmack's data mining cuts both ways on this one.
Billy Shears wrote:
Mon Sep 06, 2021 2:35 pm
3- Scattered within the 800 page book, Joseph dictated a couple dozen words that really were intended to have Early Modern English definitions that nobody properly understood until Carmack and Skousen did this research.


Having taken a hard look at all of Carmack and Skousen's examples, I do think there are some cases where the meaning is ambiguous enough that an Early Modern English definition may not have been intended. But by the time you get through all 26 it's very hard to make that case for all or even most of them. This is part of why I cut the number of valid cases in half.

It's also important to note that there are many, many other examples beyond the 26 where the meaning doesn't appear to be extinct, but was still exceptionally rare--rare enough that Carmack couldn't locate them until he had the full might of NCCO at his disposal. Again, the naturalistic hypothesis would be that all of these remained in Joseph's dialect via fluke, and the idea that each of those would be a rare event would probably still apply.
Billy Shears wrote:
Mon Sep 06, 2021 2:35 pm
This seems especially difficult because we don’t have a solid understanding of the Wayne County NY dialect, the Windsor County VT dialect, or whatever dialect the author of the Book of Mormon actually spoke.


Again, for me to buy this you'd need to demonstrate that Joseph's spoken dialect was or could've been different enough from his written dialect for the vast differences in syntax and semantics to arise. Otherwise my best guess for how Joseph spoke would have to be how he wrote, and to posit otherwise would be to depart from the evidence at hand.

And in Joseph's case, we have numerous shorthand transcripts of his sermons. You'd think that such would be valid indications of his spoken dialect, and yet these don't appear to reflect anything resembling Early Modern English.
Billy Shears wrote:
Mon Sep 06, 2021 2:35 pm
1- He is assuming a priori that the “archaic meanings” really are archaic and with much more than a gazillion-to-one certainty couldn’t have existed in Joseph Smith’s time and place. Doing this isn’t adding mathematical rigor to Carmack’s arguments. It is assuming that Carmack is right.


Actually, I'm not. I'm assuming that it would be difficult (though far from impossible) to find comparable anachronistic semantic examples in other nineteenth century books. As I note in the essay, this would mean finding words with meanings that weren't used by any other book written in the author's century, but that just happened to match those used within Early Modern English. I have a very hard time imagining finding enough such examples for the Book of Mormon to not represent a massive outlier, though I'd be happy to be proven wrong.
Billy Shears wrote:
Mon Sep 06, 2021 2:35 pm
2- The probability distribution is made up and the parameterization is made up. There is literally zero justification for any of this.
And do you have ones that you think would work better? Can you back those up any better than I can? Am I crazy for thinking that these semantic examples should be quite rare?
Billy Shears wrote:
Mon Sep 06, 2021 2:35 pm
How could we possibly claim that the probability of an archaic word meaning occurring over the fixed interval of "a book" has a known constant rate of one-per-book, and that the chances of this happening are independent of the time since the last apparent archaic meaning?
How could we possibly claim such a thing for suicide rates or any of the other natural events that follow a Poisson distribution? Just because I can't prove these assumptions with exactness doesn't mean that this isn't a valid way to model rare events.
Billy Shears wrote:
Mon Sep 06, 2021 2:35 pm
(And remember. These numbers were chosen "for the benefit of the critics, of course.")
They were. It would absolutely be valid to use all 26 apparently extinct lexis, and for me to posit a much smaller lambda.

When I characterized the Early Modern English evidence as heavy dragonplate, it was in the context of proposing chinks in that armor. And I do think there are some. They would just take quite a lot of work to properly and persuasively document. If my analysis is mistaken, we should be able to look in the textual record and show just how mistaken I am. We should be able to find dialectal syntax matching Early Modern English in 19th century rural areas. We should be able to look through 19th century texts and find examples of apparently extinct lexis. But hanging one's hat on these (thin) possibilities feels an awful lot like faith. As it stands, Early Modern English in the Book of Mormon is extremely unexpected from a naturalistic perspective, and my analysis is a reflection of that current state.
dastardly stem
God
Posts: 2259
Joined: Tue Nov 03, 2020 2:38 pm

Re: Heavy Dragonplate or Extra Thin Tissue Paper?

Post by dastardly stem »

kyzabee wrote: Again, for me to buy this you'd need to demonstrate that Joseph's spoken dialect was or could've been different enough from his written dialect for the vast differences in syntax and semantics to arise. Otherwise my best guess for how Joseph spoke would have to be how he wrote, and to posit otherwise would be to depart from the evidence at hand.

And in Joseph's case, we have numerous shorthand transcripts of his sermons. You'd think that such would be valid indications of his spoken dialect, and yet these don't appear to reflect anything resembling Early Modern English.
kyzabee wrote: As it stands, Early Modern English in the Book of Mormon is extremely unexpected from a naturalistic perspective, and my analysis is a reflection of that current state.
I hate to try and sound like a voice of reason here, but a naturalistic perspective would not require that Joseph wrote the book. Not at all. All that needs to be shown to account for the presence of Early Modern English in the book is that published works containing it were available in Joseph's day. And since we have published works from before Joseph's day showing Early Modern English, there is no reason to call it extremely unexpected from a naturalistic perspective. Those Early Modern English books exist today, so they likely existed in Joseph's day. Whether Joseph scoured any of them to pick up weird syntax is immaterial.

And more to the point, is this analysis really an effort to determine whether something supernatural is possible, or plausible? I thought the focus was to determine the probability of the book having been written anciently vs. in the 19th century. If so, the presence of Early Modern English doesn't address the issue, it seems to me. It's just as probable that the Book of Mormon was written anciently as in the 19th century given the presence of Early Modern English.

I should note...I'm D6 on Sic et something or other.
“Every one of us is, in the cosmic perspective, precious. If a human disagrees with you, let him live. In a hundred billion galaxies, you will not find another.”
― Carl Sagan, Cosmos
Lem
God
Posts: 2456
Joined: Tue Oct 27, 2020 12:46 am

Re: Heavy Dragonplate or Extra Thin Tissue Paper?

Post by Lem »

Physics Guy wrote:
Tue Sep 07, 2021 7:24 am
I think I clarified the Bayesian form of the Sharpshooter Fallacy a while back, at least for me...

The fallacy consisted of using the P(data|hypothesis) that was appropriate for the strong version of the hypothesis together with the P(hypothesis) prior that was appropriate for the weak version of the hypothesis. This fallacious bait-and-switch conceals a probability factor which is exactly equal to the very low default probability of the observed data. This is exactly the trick....

One can very easily even appear to be being extravagantly generous to critics, conceding impressively tiny prior probabilities—that is, prior probabilities that would be conservatively low for the weak version of the hypothesis. The huge factors from large sets of data, that are smuggled in by the bait-and-switch from weak to strong in the hypothesis, can easily overwhelm any apparent concession with the priors.

It's a trivial and obvious trick once you see it, but like many such tricks it can be baffling until you notice the gimmick.
This is a huge point, I'm glad you brought it up again, PG. It deserves to be addressed every time it's used.

And as PG says above, it's a gimmick. Looking at Kyler's calculations, you can see this perfectly when, after his calculations, he determines the order-of-magnitude change by using the ratio CH/CA.

In episodes 3, 5, 7, and 9, Kyler uses probability = 1 for CH, and a very small number for CA.

CH is the probability that his hypothesis favoring antiquity is true given the evidence. This is simply assumed by Kyler, without any supporting data.

CA is how likely the alternate hypothesis is, given the evidence. This is also assumed, with some justification but no supporting data.

So, 1 divided by a very, very small number will always result in a large number and a positive magnitude shift, supporting the Book of Mormon antiquity.

The smaller the denominator, the larger the magnitude shift.

So, in spite of pages and pages of justification, the change in magnitude in support of antiquity ultimately comes down to two ASSUMED probabilities. Probability assumptions that are unrealistic and unjustifiably high (P=1) in favor of antiquity, and probability assumptions that are equally made up and unjustifiably low against it.

In fact, any probability could be used for the denominator, and as long as 1 is assumed for the numerator, the magnitude shift will favor Book of Mormon antiquity.

Like PG said, it's a gimmick. And it's very bad statistical analysis.
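Lem's ratio point can be made concrete in a couple of lines (CH and CA here are just assumed numbers, as in the episodes being criticized):

```python
import math

def magnitude_shift(ch, ca):
    """Order-of-magnitude shift implied by the ratio CH/CA."""
    return math.log10(ch / ca)

# With CH pinned at 1, the "evidence" is entirely the assumed denominator:
for ca in (1e-2, 1e-6, 1e-12):
    print(ca, magnitude_shift(1.0, ca))
# Any assumed CA < 1 yields a positive shift toward antiquity, by construction.
```

The smaller the assumed denominator, the bigger the shift; nothing else is doing any work.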
drumdude
God
Posts: 7208
Joined: Thu Oct 29, 2020 5:29 am

Re: Heavy Dragonplate or Extra Thin Tissue Paper?

Post by drumdude »

dastardly stem wrote:
Tue Sep 07, 2021 2:44 pm
I should note...I'm D6 on Sic et something or other.
I’m a big fan of both you and defame the everything
Billy Shears
Sunbeam
Posts: 55
Joined: Fri Jul 23, 2021 8:13 pm

Re: Heavy Dragonplate or Extra Thin Tissue Paper?

Post by Billy Shears »

kyzabee wrote:
Tue Sep 07, 2021 2:01 pm
Thanks for taking the time to think this through, Billy, as always. And thanks for the opportunity to respond.
Billy Shears wrote:
Mon Sep 06, 2021 2:35 pm
The reason I bring this up is because Huckleberry Finn illustrates that ordinary people speak English differently than intellectuals write English, and these dialects can and do vary dramatically from location to location. Mark Twain identified no fewer than five dialects just in Pike County, Missouri. In the early 1800s, dialects were probably stronger and more varied than they are today.

And if any of these varied dialects just happened to match the syntax and semantics of Early Modern English, you'd think we could find some evidence of that in a database as broad and deep as NCCO.
Which books in the NCCO were dictated by people from New England or upstate New York who were trying to sound biblical?
kyzabee wrote:
Tue Sep 07, 2021 2:01 pm
And it turns out the textual record gives us far more insight into spoken language than you're purporting here:

https://www.degruyter.com/document/doi/ ... -0002/html
Which 19th Century novels contain extensive text of people from New England or upstate New York who were trying to sound biblical?
kyzabee wrote:
Tue Sep 07, 2021 2:01 pm
If it survived in Joseph's dialect, we should've expected it to survive at least somewhere else in the 19th century textual record.
Not necessarily. People write differently than they speak. Furthermore, people who are trying to speak in Elizabethan English are going to grope for words and usage in a way that is different than people who are trying to sound modern and educated. Because of that, the relevant textual record is incredibly small.
kyzabee wrote:
Tue Sep 07, 2021 2:01 pm
It's also important to note that there are many, many other examples beyond the 26 where the meaning doesn't appear to be extinct, but was still exceptionally rare--rare enough that Carmack couldn't locate them until he had the full might of NCCO at his disposal. Again, the naturalistic hypothesis would be that all of these remained in Joseph's dialect via fluke, and the idea that each of those would be a rare event would probably still apply.
Note how your position is sliding here. First Carmack said Joseph Smith couldn't have used these definitions because they were extinct. Then he expanded his search to more books and found that many of them really weren't extinct after all. That proves my point: just because Carmack couldn't find a word usage in a database doesn't mean it was extinct from the language.
kyzabee wrote:
Tue Sep 07, 2021 2:01 pm
And in Joseph's case, we have numerous shorthand transcripts of his sermons. You'd think that such would be valid indications of his spoken dialect, and yet these don't appear to reflect anything resembling Early Modern English.
In which of his sermons was he trying to speak in Elizabethan English?
kyzabee wrote:
Tue Sep 07, 2021 2:01 pm
Actually, I'm not. I'm assuming that it would be difficult (though far from impossible) to find comparable anachronistic semantic examples in other nineteenth century books. As I note in the essay, this would mean finding words with meanings that weren't used by any other book written in the author's century, but that just happened to match those used within Early Modern English. I have a very hard time imagining finding enough such examples for the Book of Mormon to not represent a massive outlier, though I'd be happy to be proven wrong.
The Book of Mormon is a massive outlier because it was dictated by somebody from the Northeast who was trying to use Elizabethan English. Of course a book like that will be an outlier compared to other books.
kyzabee wrote:
Tue Sep 07, 2021 2:01 pm
Billy Shears wrote:
Mon Sep 06, 2021 2:35 pm
2- The probability distribution is made up and the parameterization is made up. There is literally zero justification for any of this.
And do you have ones that you think would work better? Can you back those up any better than I can? Am I crazy for thinking that these semantic examples should be quite rare?
Joseph Smith using these 26 words in these ways is prima facie evidence that while they may have been archaic, they were still in the lexicon of people who were trying to speak using an older version of the language.

For books that are trying to sound modern and educated, the count of archaic word meanings might very well be Poisson distributed with a lambda of 1-per-800 pages. But for a book that is trying to sound archaic, the frequency of archaic word meanings is probably closer to a Poisson distribution with a lambda of 26-per-800 pages.

But as your data proves, pseudo-biblical books are not homogeneous with regard to their archaic characteristics--some reflect Elizabethan English much better than others. The lambda varies from author to author.

Whatever the correct lambda for the Book of Mormon is, it should be identical in the "it was translated" hypothesis and the "it was made up" hypothesis.
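The two lambdas above can be compared directly (a sketch; both rates are this post's illustrative guesses, not fitted values):

```python
import math

def poisson_pmf(k, lam):
    # Probability of exactly k events at rate lam
    return math.exp(-lam) * lam**k / math.factorial(k)

k = 26  # the observed count of alleged archaic meanings

p_modern = poisson_pmf(k, 1.0)    # modern-sounding book: 1 per 800 pages
p_archaic = poisson_pmf(k, 26.0)  # archaic-sounding book: 26 per 800 pages

print(f"P(26 | lam=1)  = {p_modern:.2e}")
print(f"P(26 | lam=26) = {p_archaic:.2e}")
# If the same lambda applies under both the "translated" and the "made up"
# hypotheses, as argued above, the Bayes factor from this count is 1.
```

Under the archaic-book rate the observation is entirely ordinary; only the mismatched modern-book rate makes it look miraculous.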
Dr Moore
Endowed Chair of Historical Innovation
Posts: 1889
Joined: Mon Oct 26, 2020 2:16 pm
Location: Cassius University

Re: Heavy Dragonplate or Extra Thin Tissue Paper?

Post by Dr Moore »

kyzabee wrote:
Tue Sep 07, 2021 2:01 pm

They were. It would absolutely be valid to use all 26 apparently extinct lexis, and for me to posit a much smaller lambda.
There is no universe in which the presence of material from time (T) in a book published in time (T+1) is evidence that the book was actually composed at time (T-1). All you've accomplished with this episode is to highlight the distortion field necessary to entertain Early Modern English as anything but a fatal blow to the Book of Mormon's historicity claim.

As in many of your episodes, you're using "unusual" as a proxy for "ancient" and then plumbing for some statistical construct to demonstrate a dimension of "unusual" that can be quantified. Repeated use of non sequitur logic doesn't somehow average out to good logic! No matter how much you flourish the pieces with statistical showing-off.