Taking Advantage of 2L Course Options
Kimberly Crowley (’20) offers advice on course selection on the BU Law JD Student Blog.
I cannot vouch for all other 2L students, but for me, choosing classes for my second year was a little bit anxiety-inducing. Like many other people, the only other time I’ve had to choose my classes was in undergrad, when the requirements for a major were nicely handed to me by some lovely academic advisor, who helped me schedule when best to take them and with which professors. I mean, in certain ways, law school is not so different. Your academic advisor sits you down and lets you know what will be on the bar exam. They help you prioritize and understand your options. The main difference is… there are so many options! The requirements to graduate from law school are much vaguer than in undergrad, and it feels like the legal world is your oyster. Doctrinal classes to prep for the bar, seminar courses on interesting topics, externships, study abroad opportunities, semesters in practice… I sat there wondering, “I can’t do all of this… so how do I possibly choose?”
In all honesty, I’m still working this out. I have changed my second semester schedule approximately five times in the last two weeks. The one piece of advice I can give to a rising 2L is one that I was afraid to take myself: Enroll in a class outside your wheelhouse, just because you’re interested in it.
For me last semester, that class was Privacy, Security, and Technology with phenom new professor and my academic advisor, Professor Ahmed Ghappour. I took that seminar for no other reason than I looked at the title and thought “well, that sounds interesting.” Thankfully, the law firms I interviewed with this summer validated that choice, with one attorney after another praising the decision and letting me know how important it is for attorneys nowadays to understand upcoming technology and its privacy implications.
However, this was my background in technology going into the class: I have a Facebook. I saw The Social Network. I have filled up my iCloud storage and don’t know how to fix that. I struggled to set up Alexa…. I could keep going, but you get the idea. Now cut to a few (real, 100 percent not made up) comments I made in this seminar on such a sophisticated and topical area of law:
“Fun Fact: 50 Shades of Grey was a fanfiction.”
“Wait what’s the torr network?”
“Hey, John Oliver talked about Net Neutrality!”
“We should watch Hasan Minhaj’s Patriot Act episode about Free Speech and Facebook.”
Yes, as you can see, I felt like a moron for two hours a week, every week. But… the class taught me an important lesson—namely, that there is a major upside to feeling like a moron for two hours a week. This feeling means that you have so much you can learn in those two hours, and that, so long as you’re interested in the topic, each and every minute will be interesting.
By the end of the seminar, I had learned enough, and enjoyed it enough, to write a paper about forensic evidence and the criminal justice system. Did you not hear about it? I’m surprised. You must not be in my network of friends and family, who were forced to hear about it every minute of every day over Christmas break, as I could not stop talking about every fascinating detail of what I had learned while writing the paper. Don’t believe me? Here’s an excerpt. But before I post it… I’m going to bring this post back to the point. Take something out of your wheelhouse; be okay with feeling dumb; and let the learning commence!
— An Excerpt From A Paper About A Topic I Knew Nothing About 3 Months Ago —
A quick scroll through Netflix reveals American society’s fascination with forensic science. From Dexter’s blood-spatter patterns to NCIS’s bite mark comparisons to Law and Order: Special Victims Unit’s microscopic hair analysis, many of Netflix’s most popular shows center around seemingly remarkable scientists catching the bad guys using new technology. Yet, behind the Netflix scenes but in the public eye, some of America’s premier scientific institutions have been telling a different story. Forensic disciplines have increasingly come under scrutiny in the last decade as insufficiently tested and, in many cases, even invalid. In fact, two national committees have released reports within the last ten years detailing extensive flaws in many forensic science techniques. The first of these occurred in 2009 when the National Research Council for the National Academy of Sciences reported that, other than DNA analysis, “no forensic method has been rigorously shown to have the capacity to consistently, and with a high degree of certainty, demonstrate a connection between evidence and a specified individual source.”[1] Specifically, the NRC report implicated fingerprint examinations,[2] firearm (ballistics) and toolmark identifications,[3] handwriting examinations,[4] microscopic hair analysis,[5] and bite mark comparisons.[6]
In response to these findings, President Obama announced the President’s Council of Advisors on Science and Technology (PCAST) shortly after the NRC released its report. He then tasked it in 2015 with determining “whether there are additional steps on the scientific side, beyond those already taken by the Administration in the aftermath of a highly critical [NRC Report] on the state of the forensic sciences, that could help ensure the validity of forensic science.”[7] When PCAST issued its report in 2016,[8] it too found shortcomings in “virtually all aspects of [forensic] evidence, from foundation through application.”[9] Thus, for the past two years, behind the quick scroll through Netflix have lurked some startling realities — namely, the uncertainties associated with Dexter’s blood-spatter patterns are “enormous”[10]; appropriately designed validation studies are lacking for NCIS’s bitemark analysis, with experts unable to agree on whether an injury is a human bite mark[11]; and, a recent FBI review concluded that, of twenty-eight FBI agents who conducted microscopic hair analyses, twenty-six made erroneous statements in written reports or oral testimony in court.[12]
[1] Nat’l Research Council, Strengthening Forensic Science in the United States: A Path Forward 14 (2009).
[2] Id. at 144 (stating research is needed to validate fingerprint identification).
[3] Id. at 154 (“Sufficient studies [on firearms identification] have not been done to understand the reliability and repeatability of the methods.”).
[4] Id. at 166 (“The scientific basis for handwriting comparisons needs to be strengthened.”).
[5] Id. at 161 (“An FBI study found that, of 80 hair comparisons that were ‘associated’ through microscopic examinations, 9 of them (12.5 percent) were found in fact to come from different sources when reexamined through mtDNA analysis.”).
[6] Id. at 174 (“Even when using the guidelines, different experts provide widely differing results and a high percentage of false positive matches of bite marks using controlled comparison studies.”).
[7] Eric Lander et al., PCAST Releases Report on Forensic Science in Criminal Courts, White House (Sept. 20, 2016, 5:59 AM).
[8] See generally President’s Council of Advisors on Sci. & Tech., Forensic Science in Criminal Courts: Ensuring Scientific Validity of Feature-Comparison Methods (2016) (finding flaws with DNA analysis of complex-mixture samples, latent fingerprint analysis, firearms analysis, hair analysis, footwear analysis, and bite-mark analysis).
[9] Jane Campbell Moriarty, Deceptively Simple: Framing, Intuition, and Judicial Gatekeeping of Forensic Feature-Comparison Methods Evidence, 86 Fordham L. Rev. 1687, 1689 (2018).
[10] Nat’l Research Council, supra note 1, at 179.
[11] Paul C. Giannelli, Forensic Science: Daubert’s Failure, 68 Case W. Res. L. Rev. 869, 881-82 (2018).
[12] Press Release, FBI, FBI Testimony on Microscopic Hair Analysis Contained Errors in at Least 90 Percent of Cases in Ongoing Review (Apr. 20, 2015).