A wonderful example of transparency about the academic hiring process, in two tweets:
In sports and training there is a concept of “becoming an athlete before starting a sports career”. We hear a lot about very successful prodigies who start to specialize in a particular sport (or other activity) early on. But for professional athletes, and certainly for most people, it is essential to develop a strong base of general physical preparedness, known as GPP.
Today we’ve been reminded that a similar General Professional Preparedness is important at work too:
This list is excellent, and applicable to a wide range of professionals. For science undergrads, PhD students, and postdocs, we can extend it further (on top of the list linked above):
- Experimental design and agile troubleshooting
- Collaboration on publications (especially as “last” author or the most interested party)
- Conflict resolution and negotiation skills
- Writing skills (including writing/publishing alone or just without PI)
- Mentoring skills (including sponsorship skills)
- Teaching (formal and informal, at all levels: senior colleagues and junior mentees)
- Presenting skills (including job interviews)
- Searching for jobs (starting at undergrad level, e.g. writing CV, cover letter, reaching out to PhD advisers, finding the right fit)
- Searching for employees and interviewing candidates
- Asking for help, and efficiently seeking advice/support
- How to pitch (cover letters to editors, funders, etc.) and deal with rejections
- Basic employment skills
All these skills are “taught” during a scientist’s career, but almost never formalized or emphasized by supervisors. For example, writing is taught through editing and liberal use of red ink. Mentoring is almost never taught, and is simply passed along as bad (or good) habits from person to person. Learning how to search for a job is almost never part of the academic process either. Many of these skills, however, are discussed elsewhere, notably on Workplace StackExchange.
Academic groups, especially PIs, have neither the time nor the inclination for most of this work, and perhaps consider it extra-curricular. However, these skills are really the core of any professional’s performance. If PIs are not willing to invest personal time in teaching these skills, they should at least emphasize that the skills are essential yet cannot be acquired without practice and focus.
Previously we introduced a “light” version of Paired Sciencing: simply teaming up with a colleague in a quiet conference room and focusing on the tasks at hand, not necessarily shared work.
There is also an extended version, where actual science is done in a pair. Activities can include designing an experiment, writing a grant, actual benchwork, microscope alignment, and many more.
In academia we often work alone on our tasks, occasionally meeting to discuss results or plans. That creates a huge distance between the design and the implementation of a plan. Working in a pair with a trusted colleague ensures that obvious mistakes are caught early. This is especially important in biological experiments, where protocols can take days to finish.
Work in science, especially at a “high” level (whatever that means to you), produces a limited number of valued artifacts. Mostly these are peer-reviewed papers, invited reviews, fellowships and grants, PhD theses, and conference papers and posters.
However, that is not enough. Much knowledge goes unnoticed, unrecorded, unorganized, and unchecked because we don’t value many kinds of artifacts. Papers that reproduce published work are undervalued and rarely noticed. The work and experience of technicians is not valued because there are no formats to make it visible. Troubleshooting work that requires expertise and patience is not valued, because it is mostly done in the silence of the lab.
Creating new formats for recording artifacts of the scientific process would make it easier to show work and achievements, and would highlight which practices the community accepts.
We can compare this to software development. In the recent past the only valued metric was LOC – lines of code, or rather KLOC, “thousands of lines of code”. Then came the era of Test-Driven Development, where (sometimes over-eagerly) the most valued metric became test coverage percentage. The world of Agile brings forward “stories” that need to be closed as quickly as possible. DevOps people value uptime and latency metrics. Startups value user-base size and acquisition rate. None of these metrics (artifacts) is perfect, but they provide a range of possible goals to aim for.
Software engineers have tons of metrics and artifacts they can produce that potentially capture value and represent useful work. Scientists have far fewer.
Modern science needs to start producing – and valuing – more artifacts. We are currently learning to better acknowledge the work of Core Centers / technology providers; the role of open-source software; the role of pre-prints and open peer review; and the role of open-access publications. We need to highlight the value of other items, such as:
- technical articles
- assembly instructions for custom hardware that might not be publishable
- first-hand experimental protocols and checklists from techs/students who actually do the work
- meta-scientific articles and tools (how to make sciencing better)
- records of collaboration and technical assistance/consulting
- records of upgrading or reviving old protocols/tools/equipment
- work to make other experiments possible (e.g. building workstations or experimental setuplets)
- experiment design (perhaps through pre-registration)
- teaching and mentoring examples
The only requirement, really, is that these artifacts be preserved, and ideally archived (so personal blogs are out 🙂).
Every lab (or PI) should be able to define a set of valuable artifacts and find a medium to make them public. It is OK to care about different things, but right now there is no shared set of such artifacts, or the list is too short. By elevating some of these artifacts we can fine-tune the research process to deliver value faster and more efficiently.
Science, in general, consists of two steps performed in a loop: pick something to study; study it; then pick the next thing to study.
The blog Extreme Sciencing focuses on the second step of this process: how exactly do we perform the study? Are there better approaches to the scientific process? Can we borrow techniques from other fields of human endeavor?
The first question is very important as well. There are roughly two approaches to finding something to investigate, especially at the beginning of a career:
- Postmodern approach: “listen to the conversation”, decide what is missing or wrong, and fix it, thus moving the conversation forward
- Artistic approach: wait for enlightenment from a higher power to show you the path, and follow that path, thus staying untethered from a mainstream that can be wrong or merely fashionable
This is a philosophical distinction, and thus there cannot be a “best” approach. But the distinction is critical to understanding the modern problems of science.
The first approach (where we read the current literature, listen to scientists, and figure out what is missing) has at least two problems. First, the “conversation” can be extremely noisy, or just plain wrong. People still can’t explain exactly how they performed their experiments or provide enough data for replication! So when you read a paper claiming that something is correct or wrong, that paper often cannot be trusted. Accumulating noisy information can produce a more reliable picture, but with the ultra-specialization of science there may not be enough samples for averaging. Second, the conversation can be about something shitty, like deep fakes or other bad applications of AI. As Harvard CS professor James Mickens says: “If you are a PhD student studying Deep Fakes – drop out now. Your work will not make society better, it will only be used for revenge porn and disinformation.” And AI today is empowered with endless applications, including some that perhaps shouldn’t exist, such as in the US justice system.
The problem with the second approach to finding a scientific question – believing in a higher Truth and being guided by some external power – is that it is too easy to become untethered from reality. Not only does it create an opening for pseudo-science and general crankery, it also creates an unhealthy balance of power. How many times have you heard “You work in science, you can’t do it for the money!” or some other appeal to passion for the question? Once you accept the existence of a higher power, it becomes easy to forget about human dignity and that we have to serve ourselves first and foremost.
In conclusion, a modern approach to picking scientific questions should combine reliance on the existing literature and discussions of what is important with some filter that highlights what has the potential to benefit society and what can be harmful.
In graduate school we are often told to be self-sufficient, motivated, and responsible for our own success. That mindset emphasizes that the source of our problems (and solutions) is internal.
Meanwhile, as we’ve discussed in Solving problem at the correct level, you can’t solve external problems by working on the internal side of the equation.
It is your job to work hard, try to be smart, and put good effort into your work. But there are many people around you – mainly mentors – whose job is to help you out. If they don’t, you can’t solve that problem by working harder. You have to fix the External: change mentors, or manage expectations in a way that minimizes their impact.
The same applies to assigning blame and finding the source of a problem. Some people, whether in grad school or the real world, lay the blame for their mishaps on others, making it external: “If only X and Y did better! If only my mentor told me to apply for Z!” Others “take responsibility” and carry the weight alone, internalizing everything that happens: “I should’ve known better than to trust them!”
The truth, as often happens, lies between the two approaches. I have gradually moved from internalizing the issues of my PhD experience to externalizing them, and am now settling into a more balanced view.
After we figure out who is responsible for something, we need to understand how to control it and introduce change, which can also be external or internal.
Academics, including PIs, receive very limited management training. The common understanding is that a new PI will pick up these skills from previous advisers, but academia should stop being apprenticeship-based.
The IT world provides us with an example of how to do such training: tabletop scenarios, such as @badthingsdaily.
This Twitter account posts examples of IT incidents that can and do occur in practice, from very specific cases like “your network has been compromised” to “your CEO has been arrested in a foreign country famous for kidnapping”, and many more:
The goal of these exercises is not to come up with a perfect “playbook” for when something bad happens – academia is too heterogeneous for that. But they should start the conversation, and provide material for finding the weaknesses in the process. For the academic world, specifically running a lab, “bad things” include:
- an international PhD student can’t get their visa renewed and has been deported
- a PhD student hits their 7th year without a single first-author paper
- a global pandemic hits, and the lab has to shut down for 2 months
- your paper is found to contain duplicated images in its figures
- an experiment performed by your lab cannot be reproduced by a trusted collaborator
- you were not able to secure funding for next year; you have budget for 6 months
- a project developed by a PhD student has just been scooped and published by another group
- you (the PI) have been diagnosed with clinical depression
- your lab members want to know what you have done to advance under-represented minorities (URM) in science and decrease systemic bias
- your lab tech, who places all the orders and prepares reagents, just quit with 2 weeks’ notice
Recently I’ve asked whether academia has conceptual frameworks for project (and general sciencing) management like software development has. One comment was that “science is more like a craft” and that extra bureaucracy is unnecessary. Some people brought up that science is apprenticeship-based activity, where next generations learn from elders.
The academic scientific process would greatly benefit from being treated like a business project. Yes, we face a lot of uncertainty. Yes, we need to be free to explore. But even an art like cooking has come up with concepts, such as Salt, Fat, Acid, Heat, or the idea that “baking is a precise science”. There are many concepts that cooks have adopted universally, without passing off cooking a steak as some sort of magick. It is not easy, and it requires practice, but it is still doable.
Similarly, in science there is of course a huge component of luck, skill, experience, and serendipity. Meanwhile, there are good practices that have to be openly adopted and discussed as a “standard of practice”. However, the discussion of these practices should have only one goal – making communication easier – not standardizing the science itself. Similar to A Pattern Language, we need “A Science Language” to bridge the gap between new scientists and those who have worked in the field for years.
If you wish to contribute to creating that language, try to answer, in written form: what does “PhD student” mean to you? What is a “PhD thesis” supposed to look like? How can we manage a lab or its finances? What are the potential roles people can hold inside academic research (e.g. consultant, technician, research professor, etc.)? Academic researchers need to discuss these terms and agree on some framework for thinking and communicating about them.
The PhD process is extremely heterogeneous. As a prospective PhD student, you can barely figure out what to expect from talking to current and past members of a lab.
Let’s make this easier by using a clear template for communicating what the PI expects from the student:
Authorship, together with citations, works as academic currency. This is how we know something is valuable: people in the community discuss it.
When it comes to authorship, however, things get trickier, as we only have three categories for authorship of standard scientific papers: first authors, last authors, and “middle” authors. Mapping name order to contribution is not trivial.
We can imagine treating a paper as a company, and authorship as its ownership structure. Each person then owns part of the company (the paper), and that share should be made visible.
We all know papers where the last author, in that scheme, would “own” 1% or even less. And we know papers where people who should have received 30% of the ownership are merely “acknowledged” at the end.
But academic papers are not companies, or products on a free market. We don’t have a Securities and Exchange Commission to hold people accountable; it has to start within the community. Accountability can be established by secret pre-registration of the paper. We often know that our work will result in a paper and a pre-print. Why not tell bioRxiv early on: “Hey, we are writing this. Authorship is split four ways, 25/25/30/20%, between these authors.”
In the case where author A wants to bring in a collaborator, they can negotiate with the other stakeholders about the fraction of the paper to give the collaborator. If somebody decides to quit the project, their shares can be diluted among the other authors. Splitting “shares” of a paper also reminds us that inviting more people to a project comes at a cost, but can greatly increase its value (just like any investment).
While far from perfect (and probably impossible to implement), this scheme already offers something of value: a language for discussing paper-authorship situations. For example, a PI can state from the beginning: “This paper is not my responsibility, so postdoc X will hold 51% of the shares.” That makes it clear from the start who is really in charge.
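As a minimal sketch of the bookkeeping this scheme implies, here is a toy model of the share arithmetic: the initial split, bringing in a collaborator, and diluting a departing author’s stake. The class name, method names, and proportional-dilution rule are all hypothetical illustrations, not an existing tool or a settled convention.

```python
class PaperShares:
    """Toy model of paper authorship treated as an ownership structure."""

    def __init__(self, shares):
        # shares: dict mapping author -> percentage; must total 100
        assert abs(sum(shares.values()) - 100) < 1e-9
        self.shares = dict(shares)

    def add_collaborator(self, name, pct):
        # Every current author gives up a proportional cut to fund the newcomer,
        # so the total stays at 100%.
        scale = (100 - pct) / 100
        self.shares = {a: s * scale for a, s in self.shares.items()}
        self.shares[name] = pct

    def remove_author(self, name):
        # A departing author's stake is diluted proportionally among the rest.
        freed = self.shares.pop(name)
        total = sum(self.shares.values())
        self.shares = {a: s + freed * s / total for a, s in self.shares.items()}


# The 25/25/30/20 split from the pre-registration example above:
paper = PaperShares({"A": 25, "B": 25, "C": 30, "D": 20})
paper.add_collaborator("E", 10)  # everyone cedes a proportional slice to E
paper.remove_author("E")         # E quits; shares return to the original split
```

Whether dilution should be proportional, or renegotiated case by case, is exactly the kind of question this shared language would make discussable.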
There are many problems with treating papers as products or commodities. While knowledge is a commodity today, it is very hard to measure, break into pieces, and evaluate. Using monetary language, however, can be useful in managing the writing and publication process.