An interesting experiment is to ask exactly the same hypothetical questions of two vocational experts, on the same day, at the same hearing office. You may be amazed to learn how quickly jobs are created and destroyed in the imaginary world of the vocational expert. It would be fun, even silly, if that testimony were not relied upon by SSA to deny (or award) disability benefits. It’s a world of make-believe numbers with dramatic consequences. So, what do you do?
The vast majority of attorneys and representatives respond to vocational “expert” (VE) testimony at SSA’s disability hearings by working to drive down the quantity of jobs given. They ask the VE about additional conditions in hypothetical questions after the Administrative Law Judge has finished her questions. While this certainly is something that the good attorney will strive to do, it is only a small part of the job. It also is useless if the ALJ is determined to turn down the case, and has set the stage for a denial.
The other part of the attorney’s job includes asking the VE “how do you know?” Of course, the average VE will reply “due to my 20 years of experience.”
That is not a sufficient answer to explain testimony that there are 22,235 unskilled sedentary security monitor jobs. We want to know, "How do you know?" To that, the usual VE will say, "It says so right on this report!" Which report? Sometimes it is from U.S. Publishing (which is not a government source) or SkillTran (which is also not a government source).
So, we ask, “how did they know?” “Show us how that company got to that number, show us that it is reliable, and tell us how you used that number to answer the ALJ’s questions.”
What it all comes down to in the VE's "numbers game" is reproducibility. One way to sum it up, for the evasive VE and for the ALJ who constantly interrupts to ask what you are getting at, is: "Assume, Ms. VE, that my client and a reviewing District Court do not believe you. Please explain the methodology that allowed you to answer a hypothetical question (of which you had no prior knowledge) in 15 seconds, and to arrive at a figure of 22,235 unskilled sedentary security monitors." In other words, "Give us the ability to test what you have just said, exactly as you said it, to ensure that it is correct rather than a number you pulled from a report or computer program that you do not understand."
Researchers from Smith College, Duke University, and Amherst College describe a similar problem with such allegedly scientific information in another way:
"The ability to duplicate an experiment and its results is a central tenet of the scientific method, but recent research has shown an alarming number of peer-reviewed papers are irreproducible.
A team of math and statistics professors has proposed a way to address one root of that problem by teaching reproducibility to aspiring scientists, using software that makes the concept feel logical rather than cumbersome.
Researchers from Smith College, Duke University and Amherst College looked at how introductory statistics students responded to a curriculum modified to stress reproducibility. Their work is detailed in a paper published Feb. 25 in the journal Technological Innovations in Statistics Education.
In 2013, on the heels of several retraction scandals and studies showing reproducibility rates as low as 10 percent for peer-reviewed articles, the prominent scientific journal Nature dedicated a special issue to the concerns over irreproducibility.
Nature's editors announced measures to address the problem in its own pages, and encouraged the science community and funders to direct their attention to better training of young scientists.
"Too few biologists receive adequate training in statistics and other quantitative aspects of their subject," the editors wrote. "Mentoring of young scientists on matters of rigour and transparency is inconsistent at best."
The authors of the present study thus looked to their own classrooms for ways to incorporate the idea of reproducibility.
"Reproducing a scientific study usually has two components: reproducing the experiment, and reproducing the analysis," said Ben Baumer, visiting assistant professor of math and statistics at Smith College. "As statistics instructors, we wanted to emphasize the latter to our students."
The grade school maxim to "show your work" doesn't hold in the average introductory statistics class, said Mine Cetinkaya-Rundel, assistant professor of the practice in the Duke statistics department. In a typical workflow, a college-level statistics student will perform data analysis in one software package, but transfer the results into something better suited to presentation, like Microsoft Word or Microsoft PowerPoint.
Though standard, this workflow divorces the raw data and analysis from the final results, making it difficult for students to retrace their steps. The process can give rise to errors, and in many cases, the authors write, "the copy-and-paste paradigm enables, and even encourages, selective reporting. . . "
To Teach Scientific Reproducibility, Start Young | Duke Today
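The reproducible workflow those researchers advocate can be sketched in a few lines. The course described in the article used statistical software, but the idea translates to any language: keep the raw data, the computation, and the reported numbers in one self-contained script, so anyone can rerun it and get the same result. (This is only an illustrative sketch with made-up measurements, not anything from the study itself.)

```python
# A reproducible analysis keeps raw data, computation, and the reported
# result together, so the work can be rerun and checked end to end.
import statistics

# The raw data lives with the analysis instead of a separate spreadsheet.
# (Hypothetical measurements, for illustration only.)
measurements = [4.1, 3.9, 4.3, 4.0, 4.2]

mean = statistics.mean(measurements)
sd = statistics.stdev(measurements)

# The "report" is generated from the computed values, never copy-pasted,
# so the numbers in the write-up cannot drift from the analysis.
report = f"n={len(measurements)}, mean={mean:.2f}, sd={sd:.2f}"
print(report)  # prints: n=5, mean=4.10, sd=0.16
```

Contrast this with the copy-and-paste workflow the article criticizes, where a figure typed into a Word document cannot be traced back to the calculation that produced it. That is precisely the question to put to the VE: not just "what is the number?" but "show me the script."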