FUN Newsletter, January 2017, Volume 4, Issue 1


Past-President’s Corner:

Happy New Year all! As your now Past-President, I just wanted to take a minute and
say thank you to the FUN membership and the FUN Executive committee that helped
to support my time as President of our fabulous organization. The Exec group
provided indispensable counsel and support, and I appreciate them for their time and
efforts. I also want to thank the membership as a whole; we had many members step
forward and take on more active roles this year such as assisting with committee work
and staffing the professional development sessions at the booth. I hope that all
members will continue to think of ways they can engage more actively with our
organization – we have a lot of initiatives and ideas, but not always enough
hands/minds to carry them out.
And what a successful 25th year it was! As usual, we had a full house for the Social and
Poster session, with an estimated more than 650 people in attendance, over 160 posters, and 21 travel award
winners! We had a wonderful opportunity to thank
Sally Frutiger, Stephen George, Julio Ramirez and
Dennison Smith for starting FUN 25 years ago! We
also set a record at the booth, with over $12,000 in sales! Thank you to all that participated in the
Social, supported student poster presentations and
staffed the booth throughout the week.
I look forward to our 26th year, working with new and returning members of the Exec
committee and with all of you, and I can’t wait to see many of you again soon at the
2017 FUN Education Workshop at Dominican University, July 28-30. To use the
acronym one more time – FUN people are my people, and I’m so thankful to share in
the organization with you. – Amy Jo


Inside this Issue
Award Winners
Try ESCI
Want a PUI Job?
JUNE Editor’s Corner
Assessment
Summer REU
SFN NDP Survey
Upcoming Events

Newsletter Staff
Elizabeth Becker, Saint Joseph’s University
Carlita Favero, Ursinus College
Charles Weaver, Saginaw Valley State University
Jade Zee, Northeastern University
FUN AWARD WINNERS!

2016 FUN Equipment Loan Program:

Elizabeth Krusemark, Millsaps College, "Examining Cognitive and Affective Mechanisms of Information Processing in Psychopathology: A Proposal for Educational and Basic Research Growth at Millsaps College." Award: ADI ML856 PowerLab 26T system for research-quality data collection and educational demonstration from ADInstruments.

Deanne Buffalari, Westminster College, "Reinforcement enhancement properties of nicotine during reinstatement of cocaine conditioned place preference driven by stressful and cocaine-paired stimuli." Award: Conditioned place preference equipment from San Diego Instruments.

Greg Butcher, Thiel College, "The Cognitive, Behavioral, and Neural Consequences of Adolescent Nicotine Exposure in Adult Long-Evans Rats." Award: An elevated plus maze from San Diego Instruments.

Joshua Cordeira, Western Connecticut State University, "Investigating Basal Forebrain Control of Prepulse Inhibition." Award: SR-Lab Startle Response System from San Diego Instruments.

Jennifer Tudor, Saint Joseph’s University, "Impact of sleep deprivation on mechanisms of translation regulation." Award: Any-maze video tracking system, zero maze, and multi-unit open field maze from San Diego Instruments.

Maureen Rutherford, Indiana University Northwest, "Long-term neuroendocrine and behavioral effects of prepubertal Prozac® (fluoxetine) exposure in Zebrafish (Danio rerio)." Award: EthoVision XT software from Noldus Information Technology.

Lorenz Neuwirth, SUNY Old Westbury, "The effects of developmental Pb neurotoxicity on brain excitability." Award: Implantable telemetry system for data acquisition and analysis from Data Sciences International.
2016 FUN Travel Award Winners:

Zackary Bowers, Saginaw Valley State University
Lindsey Chew, University of Arizona
Sarah Chiren, Lake Forest College
Margot DeBaker, Marquette University
Matthew Downer, Memorial University of Newfoundland
Chloe Erikson, Washington State University
Miguel Alfonso Epinosa, Queens College, CUNY
Ashley Goreshnik, Lafayette College
Kelsey Idyle, Central Michigan University
Anvita Komarla, University of California, Davis
Lindsey Erin Miller, Middlebury College
Anna Miller, University of Wisconsin River Falls
Alexis Monical, Marquette University
Hannah Radabaugh, University of Pittsburgh
Taylor Redmond, Drew University
Justin Sabo, Baldwin Wallace University
Jack Sternburg, University of South Dakota
Dan Sangiamo, University of Delaware
Tyler Milewski, University of Scranton
Meggan Archey, St. Edward's University
Lauren Vetere, University of Florida


2016 FUN Faculty Award Winners:

Career Achievement Award Winners: Wesley P. Jordan and Douglas Weldon
Carol Ann Paul FUN Educator of the Year Award: Elaine Reynolds
FUN Service Award: Jeffrey Smith
FUN Mentor Award: Lora Becker

Bruce Johnson has been awarded the Society for Neuroscience Award for Education in Neuroscience! Congratulations, Bruce!
ESCI: A free statistical analysis program perfect for neuroscience majors
Bob Calin-Jageman, Dominican University

Like any other science, neuroscience constantly requires making good judgements about the world based on representative samples. There are many, many software tools available to assist with this process of statistical inference. But in terms of training undergraduate neuroscientists, it can be hard to find the perfect tool. SPSS is a common choice, but it is expensive, has a steep learning curve, and produces figures which are both ugly and uninformative. R is growing in popularity, is free, and works across platforms, but it also has a steep learning curve and is often over-powered relative to undergraduate teaching goals. There are lots of “stats light” programs, but most are proprietary and expensive.
One option worth exploring is ESCI (Exploratory Software for Confidence Intervals)—a free set of Excel worksheets developed by stats guru Geoff Cumming to make statistical inference intuitive and easy for undergraduates (Cumming, 2011). ESCI runs on PC and Mac, and students can easily download their own version for free to run on their own computers. Data is simply pasted into a worksheet, instantaneously producing the results and a compelling visualization.
What makes ESCI worth investigating? First, it is very easy to use. I use it in a class that draws both stats-oriented psych majors and stats-phobic bio majors. I’ve found that both groups of students can very easily learn to use ESCI for lab write-ups, and that it supports very sophisticated discussion of their statistical inferences. The figure accompanying this article, for example, shows the interface for comparing two repeated measures groups (a classic paired t test) from data undergraduates in my lab collected (Herdegen, Conte, Kamal, Calin-Jageman, & Calin-Jageman, 2014).

Here’s a set of research questions and the corresponding tab in ESCI that will help provide an answer:

To what extent do two independent groups differ? (classic independent groups t test): use the Data two tab with raw data, or the Summary two tab for summary data.

To what extent do scores change across two repeated measures? (classic paired t-test): use the Data paired tab with raw data, or the Summary paired tab for summary data.

To what extent do 2-6 independent groups differ? (classic one-way independent groups ANOVA): use the Ind groups contrasts tab with raw data or summary data.

To what extent do two variables interact? (classic 2x2 independent groups ANOVA): use the Ind groups 2 x 2 tab with raw data or summary data.

To what extent are scores on one quantitative variable linearly related to scores on another? (classic Pearson’s r and linear regression): use the Scatterplots tab with raw data, or the One correlation tab with summary data.

To what extent does a linear correlation vary across two groups? Use the Two correlations tab.

To what extent are two dichotomous variables related? (classic Chi square): use the Two proportions tab.

To what extent do two groups differ along a variable with outliers? Use the Robust two tab with raw data.

What sample size would be appropriate for my study? Use the Precision two tab for independent groups designs, or the Precision paired tab for repeated measures designs.

Plus, ESCI has several tabs for meta-analysis which are actually easy enough for undergrads to use and understand.

Data and labels are entered in the cells on the left, and voila!—you instantly obtain a beautiful figure emphasizing the group differences, descriptive statistics, t test results, a raw effect size with CI, and a standardized effect size with its CI. It can take a few moments to help students understand each aspect of the output, but in my experience students learn ESCI very quickly.
Another reason to give ESCI a try is that it emphasizes best practices for statistical inference: 1) every analysis is accompanied by an informative figure to help visualize the effect of interest, and 2) every analysis provides not only a p value but also an effect size estimate and a confidence interval (APA, for example, recommends these as essential in the reporting of inferential statistics). So you can give your neuroscience students a “stats light” piece of software secure in the knowledge that it’s actually quite stats advanced.
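For readers who like to see the arithmetic spelled out, here is a minimal sketch, in Python with made-up numbers, of the kinds of quantities ESCI reports for a paired design: the raw mean difference with its 95% CI, the classic paired t test, and a standardized effect size. This is not ESCI itself, and ESCI’s own conventions for standardizing the effect size and for its confidence interval may differ from this simple version.

import numpy as np
from scipy import stats

# Hypothetical pre/post scores for the same 8 students (made-up data)
pre = np.array([12.0, 15.0, 11.0, 14.0, 13.0, 16.0, 12.0, 15.0])
post = np.array([14.0, 18.0, 12.0, 17.0, 15.0, 19.0, 13.0, 18.0])

diff = post - pre
n = len(diff)
mean_diff = diff.mean()
sem = diff.std(ddof=1) / np.sqrt(n)

# 95% CI on the raw mean difference, using t with n-1 degrees of freedom
t_crit = stats.t.ppf(0.975, df=n - 1)
ci_low, ci_high = mean_diff - t_crit * sem, mean_diff + t_crit * sem

# Classic paired t test
t_stat, p_value = stats.ttest_rel(post, pre)

# One common standardized effect size for paired data:
# the mean difference divided by the SD of the differences
cohens_d = mean_diff / diff.std(ddof=1)

print(f"Mean difference = {mean_diff:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")
print(f"t({n - 1}) = {t_stat:.2f}, p = {p_value:.4f}, d = {cohens_d:.2f}")

In ESCI, of course, these numbers (plus the figure) appear as soon as the data are pasted into the worksheet.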
Oh yeah, did I mention that ESCI is
free?
ESCI has been developed and
improved by Geoff Cumming in multiple
versions over the past 15 years. It works
pretty darn well now. The latest version
was developed to accompany an
undergraduate statistics textbook which I
helped co-author. While of course I’d
love to encourage adoption of the book,
ESCI was, is, and will remain a fully
independent project—your students can
download it, learn it, and use it all on its
own.
Of course, ESCI can’t do it all—it is certainly not a complete replacement for SPSS or R. The list of what it can do, though, is surprisingly extensive (see the list accompanying this article). I find that ESCI hits the sweet spot for neuroscience majors, who certainly need to become savvy about statistical inference, but who may not have the space in their course load for the same level of statistical training as a psych or informatics major. ESCI is also so non-threatening that it can be easy to encourage other faculty in your program to adopt it, meaning your majors can have a seamless experience with statistical analyses with a minimum of cost and fuss.
If you try out ESCI or end up using it in a class, please drop me or Geoff a line ([email protected] and/or [email protected]) to let us know, especially if you run into any bugs or have any feedback, feature requests, etc.

References
Cumming, G. (2011). Understanding the New Statistics: Effect Sizes, Confidence Intervals, and Meta-Analysis. New York: Routledge.
Herdegen, S., Conte, C., Kamal, S., Calin-Jageman, R. J., & Calin-Jageman, I. E. (2014). Immediate and Persistent Transcriptional Correlates of Long-Term Sensitization Training at Different CNS Loci in Aplysia californica. PLoS ONE, 9(12), e114481. http://doi.org/10.1371/journal.pone.0114481

CONTRIBUTE TO THE NEXT FUN NEWSLETTER!

We welcome submissions on any topic
suitable for the FUN membership including:
Editorial – an opinion piece on an issue or
topic relevant to the advancement of FUN’s
mission.
I wish I’d known then – advice you wish
you’d been given related to teaching
neuroscience, career development,
managing research or other topics relevant
to FUN membership
Resource Pointers/Reviews – summary and
review of a teaching resource you find
useful (book, article, video, website, etc.)
Ask FUN – a question on which you seek
feedback from the FUN community (e.g.
grading dilemma, managing work-life
balance, etc.)
Other – submitted articles directly relevant
to FUN membership may be solicited or
accepted for publication.

Please submit your article via email to
[email protected]

What can you expect from a job at a Primarily
Undergraduate Institution (PUI)?

Luke Daniels and Katherine Mickley Steinmetz
At the Society for Neuroscience Annual
Meeting this past November, FUN faculty
held several informal meet-up sessions.
This article summarizes a few recurring
themes from our conversations at the
session:
Do you want, and how do you
get, a job at a PUI?
We’ll focus here on
the “Do you want” part of this question,
as a previous newsletter article (How to get a job at a Primarily Undergraduate Institution; see resources) discusses the
process of preparing a competitive
application and navigating the interview
process. Many other resources are
available that highlight the PUI work
environment; some are provided below.
As always, feel free to contact FUN
members directly if you have questions.

What can you expect from a job
at a PUI?

During the SFN session, three themes in
particular emerged from our
conversations. These are briefly (and
certainly incompletely) described here.
We note that these themes are not
unique to undergraduate institutions,
and that faculty at other institution
types may have similar experiences and
expectations, whether working with
undergraduates or graduate students.
Institutions vary considerably in their
mission and expectations of faculty
teaching and research. However, faculty
at a PUI can expect to encounter the
following, regardless of mission or
relative weight given to faculty duties in
teaching, research, and service.

Is working at a PUI a good career choice for you? Find out
by using the chart below, developed from the themes
discussed at the SFN session!

Do you want a job at a PUI?
At a Primarily Undergraduate Institution
(PUI):

1. You may be the only expert in your research
area and you almost certainly will teach on topics
outside your core area of expertise.
Maybe your
expertise is cell biology; at some institutions you
might be the only faculty member that knows how to
culture cells. You may be trained in
electrophysiology—it’s possible that you might be the
only person in town (or even within 200 miles if you’re not in a densely populated urban area!) that knows how to voltage-clamp. In any case, even at the largest PUIs you are likely to be one of a very small few subject matter experts in your field. This means that if you want to have high-level, research-intense conversations or bounce ideas off of another expert, you must do so via collaborations, at conferences, etc. This environment can be extremely rewarding! Your class sizes may be very small (10-20 students or less) at the upper-division level in your specialty area. However, the trade-off might be that you will also need to teach larger courses (40-100 students) that stretch you professionally (a cell biologist may also teach Genetics or Human Anatomy and Physiology). Do you enjoy learning broadly about academic areas that are not squarely aligned with your training? If so, a PUI will give you plenty of opportunity to grow in this way.

2. You’ll likely encounter a diverse student
population, especially with regard to academic
preparation and subject-matter interest.
Most PUI
faculty teach introductory or non-majors classes
regularly as part of their teaching load. You may
have a student that is majoring in English taking your
Introductory Biology course to fulfill their only
science course requirement. If you are interested in
teaching at a PUI, you should consider how you can
directly engage these students, and even mention it
as part of your application package and/or interview
responses. Do you use engaging case studies in class?
Do you use examples from pop culture? Analogies that students can relate to from everyday
experience? Make sure to point this out as you apply
and are invited for interviews. Consider proposing to
teach a course that directly meets the needs of
several types of students—music and science, science
writing, or a non-majors course on science vs.
pseudoscience. Courses that involve a collaboration
with a current faculty member outside of your home
department may be a welcome suggestion as well.
PUIs value innovative courses that reach a broad
section of the student population, and showing an
interest in teaching these types of courses is a
positive indicator for a candidate even if it’s not in
the job description.

3. You’ll likely have a lot of autonomy in
curriculum design and how you teach your classes,
both in the classroom and the teaching lab. A challenge for faculty at PUIs is to engage not only the
science majors, but also novice students with little
prior experience or interest in the course topic.
Innovative teaching methods are valued at PUIs. You
may have access to prior curricula and lab exercises,
but if not you’ll need to design your own courses and
labs, possibly by using educational literature (like
JUNE or other education-focused journals),
commercially available curricular resources (from
textbook companies or vendors), resources made
available on the internet by other like-minded
faculty, your professional networks, and/or your own
ingenuity. As you modify existing curricula or design
your own, you should consider how to assess these
changes for effectiveness. There are many ways to
get feedback about whether student attitudes about
science, skills, or mastery of concepts are improved.
You don’t necessarily need to plan a publication-worthy analysis (in fact, you need to get Institutional Review Board approval to collect student data that you would consider publishing). Even a simple mid-semester survey about whether students enjoy a class
project and what they find challenging can give you a
great deal of information on whether your techniques
are successful. At a PUI, this cycle of teaching
innovation, self-assessment, and improvement in
pedagogy is valued—and it may be beneficial to you in
your job search to discuss how you plan to
incorporate these into your classes.


4. Your research, like your teaching, will center around the experiences
of students.
This isn’t to say that you won’t make progress on a research
agenda. Faculty at primarily undergraduate campuses can (and do!) produce very high quality work; however, progress is much slower than you may be
used to. Undergraduates work on research projects during the academic
year around their class (and often sports and work) schedules. They may be
able to only commit to 3-10 hours a week in the lab (with a few extremely
dedicated students working up to 20 hours a week) during the school year.
At some PUIs, students may be able to work full-time during the summer if
you are able to secure a grant to support research activity. In any case, the
obstacles to advancing a research program are the usual challenges of time
and money. At a PUI, you will spend a great deal of time training an
undergraduate to do an experiment before they become independent, and
the turn-over is very high as they progress through their major and
graduate. Funding mechanisms are generally not as large as at research-oriented institutions. For example, PUIs are generally not eligible for R01
NIH awards—thus you won’t be working with multi-million dollar research
budgets. You can expect to be involved in the ordering of supplies for your
research lab, and trouble-shooting microscopes, recording rigs, or data
analysis tools. If and when you begin to prepare a publication, you will be
writing (or very closely supervising) the writing of the manuscript, and
making figures. You may be thinking—what is the upside?! At a PUI the
mission of the institution is teaching students, and this includes teaching
students how to do research. The upside is that at a PUI, the teaching of how to do research is as important as the research itself, and this can be
just as rewarding as classroom teaching. The authors of this article have
travelled with students to conferences (including with a student on their
first airplane flight!), and mentored students through the process of
carrying out a project and writing up a manuscript for publication. These
experiences can be a pivotal part of a student’s experience in college, and
being able to participate with them is a fantastic part of a job at a PUI.

Summary:
The key to being happy at a PUI is that you must truly enjoy teaching and
appreciate the challenge of teaching students at a variety of different
levels. You can expect that this focus on teaching will extend to your
research program, where you will teach through research. It can be very
rewarding, but is often different from the research-focused grad school
experience.
Resources and Perspectives on the environment at a PUI:

Steinmetz, K.R.M. How to get a job at a Primarily Undergraduate Institution. Faculty for Undergraduate Neuroscience (FUN) Newsletter, January 2016, Volume 3, Issue 1. https://www.scribd.com/document/295915745/fun-newsletter-2016-01-v03i01

Anastasio, A. Working at a PUI. American Society for Biochemistry and Molecular Biology Today, January 2016. http://www.asbmb.org/asbmbtoday/201601/CareerInsights

Peaslee, G. Teaching at a Primarily Undergraduate Institution. American Chemical Society, Graduate and Post-Doctoral Chemist, August 2016. https://www.acs.org/content/acs/en/education/students/graduate/newsletter/perspective-from-a-faculty-member-at-a-primarily-undergraduate-i.html

Austin, R.N. Preparing for a PUI Career. AAAS Science Magazine Online, Advice, Issues and Perspectives, March 2, 2012. DOI: 10.1126/science.caredit.a1200026. http://www.sciencemag.org/careers/2012/03/perspective-preparing-pui-career

JUNE Editor’s Corner
Candles in the Dark
Bruce R. Johnson, Cornell University, JUNE Editor
The Fall 2016 JUNE issue
continues our FUN journal tradition of
disseminating the best teaching tools and
ideas of Neuroscience educators. The
JUNE editors invite our FUN colleagues to
view the new JUNE issue (http://www.funjournal.org/currentissue/), published just before the SfN meeting. The following description of
this JUNE issue’s content is adapted from
my editorial for this issue (Johnson,
2016). The issue starts with a
description of the history, mission and
evolving format of our “Amazing Papers”
review section (Harrington et al.). Three
of the “Amazing Paper” submissions (by
Kennedy, Sable and Cecala) are
highlighted as examples of how this
section is broadening its instructional
content. Two additional “Amazing
Papers” contributions showcase primary
research articles with instructionally rich
content: 1) a paper showing that genetic
manipulation and environmental
enrichment can both influence the
performance of learned behavioral tasks
in mice (Flinn), and 2) the demonstration
that a sensory system can adapt quickly
to changing environmental conditions
(Bies). An opinion piece discusses the
recruitment of students to help plan a
new neuroscience course, with positive
educational, personal and practical
outcomes for students and faculty
(Birgbauer).
The first of the 14 full articles
continues a productive thread by Ramos
et al. of analyzing the impact of
undergraduate neuroscience programs on
student experiences and career
decisions. Here the focus is on the
importance of neuroscience as a life
science major. We welcome our first
JUNE article on Neuroscience graduate
education. Harrison et al. describe a
literature-based grad student course to
familiarize new students with modern
research methods. “Expert” advanced
grad students support beginning grad
students in presenting research methods.

Six articles present new laboratory
exercises or evaluate lab techniques. In
two of these, students determine the
modification of taste sensations by plant
extracts: 1) the reduction of perceived sweetness by the herb Gymnema sylvestre (Aleman et al.), and 2) the ability of the Miracle Fruit (Synsepalum dulcificum) to make sour foods taste sweet (Lipatova & Campolattaro). Two
more human exercises examine tactile
sensitivity protocols to distinguish
peripheral and central sensory processing
(Lowe et al.), and report a student
evaluation of EEG methods for cost
effectiveness (Shields et al.). Quiroga
and Price describe a simulation exercise
that allows students to “record” the
firing properties of a virtual, motion
sensitive neuron with realistic
physiological properties. In the last lab
article, Lemons presents a “mystery
mutant” exercise for students to
determine the synaptic site of a
behaviorally disruptive mutation in C. elegans. Other main articles describe
the effectiveness of the structural
assessment of knowledge approach (SAK)
for evaluating Neuroscience learning
(Stevenson et al.), a comparison of
instructional rubrics with other methods
of teaching scientific writing (Clabough &
Clabough), the use of social media to
engage students in understanding and
then disseminating neuroscience content
to the general public (Valentine &
Kurczek), a student internship with
neuropsychological techniques at a
neurotraining center (Schicatano &
Bohlander), and a word origin library to
teach complex Neuroscience terminology
(Hillock et al.). The last two main
articles in the issue are based on talks
presented at the teaching symposium,
“Teaching Neuroscience to Non-Scientists”, organized by Dr. Richard
Olivo of Smith College for the 2015
Annual Meeting of the Society for
Neuroscience in Chicago, IL. These
invited articles report the success of
courses using popular Neuroscience
literature (Been et al.), and
Neuroscience related cultural or news
themes (Roesch & Frenzel) to teach
Neurobiology content.

References:
Johnson, B. R. (2016) Candles in the dark. J. Undergrad. Neurosci. Ed. 11(1), E6-E7.
Mitchell CS, Cates A, Kim RB, Hollinger AK (2015) Undergraduate biocuration: Developing tomorrow’s researchers while mining today’s data. J Undergrad Neurosci Educ 14(1): A56-A65.
Sagan C (1996) The demon-haunted world: science as a candle in the dark. New York, NY: Ballantine
Books.
Schaefer JE (2016) The BRAIN initiative provides a unifying context for integrating core STEM
competencies into a neurobiology course. J Undergrad Neurosci Educ 14(2): A97-A103.


Our new “Case Studies” feature, edited by
Leah Roesch and Kristen Frenzel, continues with 2
new case presentations that use clinical themes to
teach basic neuroscience content. These
educational cases center on a patient born without
a cerebellum (Brielmaier) and a patient with a
retinal degenerative disorder (Ogilvie & Ribbens).
We present 4 book reviews in this JUNE issue: Schatz’s A Matter of Wonder: What Biology Reveals About Us, our World, and our Dreams (Kalat), Bouton’s Learning and Behavior: A Contemporary Synthesis, 2nd edition (Meyers-Manor), Kingdom and Prins’ Psychophysics: A Practical Introduction, 2nd edition (Cecala), and Luo’s Principles of Neurobiology (Hoy).
I draw attention to our JUNE “Editor’s
Choice Awards” for especially noteworthy papers
appearing in last year’s JUNE issues (Vol. 14, 1 &
2). A subcommittee of our JUNE editorial board,
organized by Barbara Lom, chooses 2 of the full articles published in each year’s JUNE issues for these awards. For “Outstanding Neuroscience Pedagogy Article”, we chose “The BRAIN Initiative Provides a Unifying Context for Integrating Core STEM Competencies into a Neurobiology Course”.
This paper describes an undergraduate
neuroscience course teaching content in the
context of the Brain Research Through Advancing
Innovative Neurotechnologies (BRAIN) initiative,
while focusing on core STEM competencies
(Schaefer, 2016). Our award for “Outstanding
Neuroscience Laboratory Article”, “Undergraduate Biocuration: Developing Tomorrow’s Researchers While Mining Today’s Data”, describes the training
of undergraduates to curate biological and clinical
data in user-friendly data bases for informatics
analyses (Mitchell et al., 2015). These and other
award winning examples of creative initiatives in
Neuroscience education can be seen at:
http://www.funjournal.org/archives/.

In his book, “The Demon-Haunted World: Science as
a Candle in the Dark”, Carl Sagan (1996) reminds me
that we scientists/educators can help bring rational
light into a world easily influenced by the darkness
of human ignorance. This darkness is expressed in
many forms, such as superstition, racism, misogyny,
and in the denial that humans can adversely affect
life on this planet. Deep cruelty directed to the
“other” is often stirred up by the fear mongering of
voices emerging from this darkness. We teach our
students to think critically through learning the
scientific process and the tools used to gather new
knowledge, through learning our organized
conventions of presenting ideas, results, and
conclusions through talk and prose, and through
using the concepts and tools of Neuroscience
creatively to generate new knowledge. Sagan’s book
helped me appreciate our important service to
recruit students to carry the “candle light” for
science, as well as for their thoughtful and rational
navigation of our world.

Assessing Your Neuroscience Program
Samantha Gizerian, Washington State University

Program assessment seems to be one of the most
hated aspects of running an undergraduate program,
particularly in the sciences. So much so in fact, that
many of my colleagues refer to it as “the A word.”
This has always struck me as odd, since
experimenting, analyzing new data, and trying
another experiment is what we do every day as
scientists. Assessing your curriculum is equivalent to
implementing an experimental control that tells you
whether your experiment is working properly. During
my recent conversations with other FUN faculty about
program assessment at the SfN meeting, this was the
dominant theme of conversation as summed up in the
questions “How do we make assessment less painful?”
and “How do we get our colleagues to participate in
assessment?”
I’m not sure why assessment has such a loathsome
reputation among faculty. I suspect it has a lot to do
with the common perception of assessment as “more
work” and another “pointless” directive from
administration. In reality, assessment of your
program is something that can be done with very
little extra effort and when directed by the program
itself, can provide very meaningful information about
teaching and learning.
Reducing the work of program assessment begins with
aligning program objectives and goals to the courses
themselves, and to individual assignments within the
courses. In my experience, programs that struggle
with assessment often find that they don’t really
know what they want students to know. An
important first step in every assessment cycle is to
ask what the program goals really are, and
subsequently, are those goals (and any associated
student learning outcomes) reflected in the
curriculum of the program? Once the program goals,
objectives, and outcomes are clear, constructing a
curriculum map that shows how each program course
aligns with the program objectives can be a good way
to identify potential sources of evidence for
assessment.
The variety of potential questions for assessment is
also daunting. It can be difficult to even know where
to start. In our program, we have developed an
assessment plan that addresses a subset of our goals
and outcomes each year as part of a three to five
year cycle. We also have given ourselves the
flexibility to push back on the schedule if a pressing
question arises.
In programs with aligned goals and outcomes, the
data for assessment typically comes from the various courses. This data can be found in course
evaluations, whole assignments, student responses to
individual exam questions, rubric-based scores, etc.
If instructors are already archiving these sources,
they simply need to pull out the relevant data and
analyze it with respect to the assessment question.
As an example, in our program one of the learning
goals is that students will have “An understanding of major neuroscience concepts and an awareness of how these are connected from the molecular to the systems level” when they graduate. We assess this
directly in our seniors as part of the evaluation of
their capstone project. In the capstone project, each
student must present a research project (lab or
literature-based) that uses neuroscience to address a
real-world problem. One of the scoring criteria for
the project is how well they connect the molecular
and systems levels of their project. The rubric we
use to score their presentations therefore has an
individual category that evaluates this criterion.
When we assess how well our program is doing with
regard to this learning goal, all we have to do is look
at the scores of our seniors in that particular box of
the rubric each year and compare that with other
measures.
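To make that bookkeeping concrete, here is a small, hypothetical sketch of what the yearly check could look like in code; the file name and column names are invented for illustration and are not from our actual rubric.

import pandas as pd

# One row per senior capstone presentation; columns invented for illustration
scores = pd.read_csv("capstone_rubric_scores.csv")

# Pull out only the rubric criterion tied to this learning goal
# (the "molecular-to-systems connection" box) and summarize it by year
summary = (scores.groupby("grad_year")["molecular_systems_connection"]
                 .agg(["mean", "std", "count"]))

print(summary)  # compare year-to-year means against other program measures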
Getting your colleagues to buy into the process of
assessment is often difficult. Concerns about
workload, administrative ulterior motives, and the
value of the work itself are high barriers to be
overcome. One approach that has been successful for
me over the years has been to focus assessment
questions on the needs of individual faculty members
and their courses. When skeptical colleagues
discover that the process of assessment can benefit
them individually as well as the program, they tend to
become more willing to participate in the process
because they see the value in the work. Moreover,
when the questions for assessment are generated
from within the program, there is less concern about
the motives of administrators in demanding
assessment reports.
Assessment doesn’t need to be the terrible, pointless
chore that so many faculty perceive it to be.
Program assessment, like an experimental control,
can be a useful tool to determine whether or not your
curriculum is working. With a little planning and
communication, you can implement sustainable,
effective assessment that has value for your program
and meets the assessment requirements of your
institution without being a burden.

Post Your Undergraduate Summer Research Opportunities With FUN
Jared Young, Mills College

Research experience is a fundamental piece
of an undergraduate neuroscience
education, particularly for students who are
pursuing a career in the field. At the same
time, research labs can benefit from the
work an undergraduate can provide. Many
institutions offer formal summer research
programs for undergraduates, and we
maintain a list of these opportunities on the
FUN website.

http://www.funfaculty.org/drupal/undergrad_internships_neuroscience

This internship list helps empower students
to find opportunities and provides a venue
for our membership and others to advertise
their programs. This page is consistently the
most visited page on the FUN website.
Of course, this list is only as useful as its
content. We appreciate any research
opportunities that you and your colleagues
can provide us with. Just send an email to
me ([email protected]). Include the name of the opportunity you would like listed (or just the institution at which it is housed) and a link to the website where more information
can be found. In addition to posting these, I
(or one of my students) periodically comb
through the list, repair/remove old links,
and update application deadlines. Any help
you can provide with this (emailing when a
link changes or a program is no longer being
run, updating us with new application
deadlines) is most helpful.
If you do not have a website, but would like
to make a document available to students
(like a flyer with information on the
program), I can also do that. Just send what
you would like posted to my email.
This link has served many students and
programs well in the past and we hope it
continues to do so for many years to come.

2016 Survey of Neuroscience
Departments and Programs

From the Society for Neuroscience Website:
A survey of SfN’s Institutional Program members and other
neuroscience departments and programs addresses a variety
of programmatic issues concerning graduate students,
postdoctoral fellows, faculty, financial support, and
neuroscience curricular and training issues. More recent data
are compared with the results of earlier surveys, providing
useful longitudinal perspectives.
Prior to 2009, the survey was conducted by the Association
of Neuroscience Departments and Programs (ANDP). Since
the consolidation of ANDP and SfN in June 2009, the survey has been administered by SfN.
In November 2016, SfN is relaunching its regular survey of
Neuroscience Departments and Programs (NDPs).
This study will collect data on program details such as:

• The administrative and financial structure of NDPs
• Training and curricular issues
• Program enrollments and demographics
This study will also look at:

• Information on the number of applicants to NDPs
• Student support in training programs
• Students' careers after completing neuroscience training programs
• Faculty metrics
2016 NDP Survey Participants: While the NDP Survey
collects aggregate information for postdoctoral
trainees and faculty, you have the opportunity to
upload a file of individual level data for postdocs and
faculty affiliated with your department or program
through the survey link that you were provided with.
This will allow a deeper look into post-doctoral
trainees and faculty in terms of ethnicity, gender,
advanced degrees, academic rank, tenure status, and
other metrics.

If you would like to provide this level of detail for
your program, please go to the link below for more
information.

https://www.sfn.org/careers-and-training/faculty-and-curriculum-tools/training-program-surveys
Data to be Collected
Structure of Neuroscience Programs
Training and Curricular Issues
Program Applicants
Program Enrollment, Demographics, and Metrics
Post-Doctoral Training
Students’ Careers After Completing Neuroscience PhD
Programs
Graduate Student Support in Neuroscience Programs
Faculty

“SfN conducts a
regular survey to
monitor the status of
neuroscience training
programs in the U.S.
and abroad, as well as
to identify emerging
trends.”

CALLING ALL PAST & CURRENT EQUIPMENT LOAN WINNERS!

We would like to collect stories/outcomes from our
equipment loan winners. These will be used to
strengthen and/or establish new relationships with
sponsors.

Please contact Leah Chase with your story via email:
[email protected]

Plan to attend the 7th FUN Undergraduate Neuroscience
Education Workshop. Come together with other FUN
members to explore new inquiry-based labs that can be
integrated into your curriculum, to discuss the latest in
neuroscience pedagogy, to exchange ideas about
career development, and to participate in sessions on
grantsmanship, program evaluation, assessment, and
so much more. Best of all, you'll be spending several
days with FUN members—folks who know the triumphs
and tragedies of teaching the Nernst equation, who have
navigated their neuroscience programs through the
cross-fire of competing departments, and who have
been in the front lines of engaging students in
meaningful neuroscience research. The best summer
weekend of conversation, commiseration, and
inspiration, guaranteed!
As an added bonus, the 2017 FUN workshop will take
place in bucolic River Forest, IL, just 10 miles from
downtown Chicago. Plan to arrive early or to stay late to
explore downtown Chicago (a short trip on the green
line), the Frank Lloyd Wright District (in neighboring Oak
Park), or the shores of Lake Michigan. Easy travel to
O’Hare International Airport, just 9 miles away.
The FUN Conference only happens every 3 years!
Mark your calendars today!
Suggestions or nominations for the program?

Email Irina Calin-Jageman: [email protected]
Upcoming Events
SAVE THE DATE
Triennial FUN Workshop
“Activities, Laboratories, and Best Practices for
Developing, Assessing, and Sustaining Inclusive Curricula”

Dominican University
Pre-Workshop Lab Training July 27-28
Main Workshop July 28-30

• 02/20/2017 - 02/21/2017: Free S-STEM Capacity Building Proposal Writing Workshop at Rice University

• 05/20/2017 - 06/25/2017: Neuroscience in Salamanca Spain Study Abroad - Summer Program

• 06/12/2017 - 08/04/2017: SysNeuro: A Summer study abroad program

• 07/28/2017 - 07/30/2017: FUN WORKSHOP
