Dr. Tammy Allen, Distinguished University Professor at the University of South Florida and Editor-in-Chief of Personnel Psychology, joins Richard and Tara to discuss what it takes to produce research that genuinely matters. She shares her framework for evaluating work through ideas, evidence, and impact — and why practical relevance should be built into a study from day one. The conversation also covers how the peer review process works at Personnel Psychology, the growing challenge of AI-generated reviews, the journal’s approach to open science, and advice for early-career researchers on curiosity, methods, and data management.
Key Takeaways:
- Impactful research starts with the question: will it lead to better understanding of work, workers, or organizations?
- Practical relevance isn’t an afterthought — it should shape how a study is designed from the start.
- Personnel Psychology has built a practitioner advisory board to keep science connected to real workplace problems.
- Policy change requires an accumulation of evidence, not a single compelling study.
- AI-generated peer reviews are a real concern, and the field currently lacks reliable detection tools.
- Personnel Psychology encourages open science practices but does not mandate data sharing, given that organizational data is often proprietary.
- Mentorship, relationship-building, and exposure to diverse faculty and methods are essential to a strong research career.
- Students should load up on methods courses and develop strong data management habits early.
- Special issues and editorial guidelines are powerful signals for shifting what the field values.
- Shifting academic incentives takes time — even a five-year editorial term is short for moving entrenched norms.
Website: https://thegig.online/
Follow us on LinkedIn: https://www.linkedin.com/company/great-io/
Join our Discord here: https://discord.gg/JcTcMu335K
Join The GIG Email List: https://docs.google.com/forms/d/e/1FAIpQLSfVQ4hyF8MA4G9W-ERwVL8_e91a-MUMuhNvxhXmgkSFUDFatg/viewform?embedded=true
00:00 Welcome & episode intro
00:25 Introducing Dr. Tammy Allen
00:52 Making research practically relevant
01:20 Ideas, evidence, and impact framework
02:19 Real-world experience and research questions
02:46 Practitioner advisory board at Personnel Psychology
03:41 Tammy’s favorite impactful paper
04:09 The 2015 telecommuting review
05:36 Communicating across disciplines
06:02 Training researchers to translate science
07:24 Building policy through accumulated evidence
08:53 What “novel” really means in research
09:22 How to publish in Personnel Psychology
09:49 Submission and desk rejection process
10:48 Role of action editors and reviewers
12:35 Two reviewers vs. three
13:32 AI-generated peer reviews
15:31 Open science at the journal
16:01 Why data sharing isn’t mandated
17:29 Tammy’s mentors and early career
18:25 Being a first-gen college student
19:24 Relationships built in grad school
20:23 Coursework and faculty exposure
22:19 Advice for early-career researchers
23:18 Stay curious, read widely
23:48 Methods, data management tips
26:37 Shared data repositories
27:05 A recent paper that made an impression
27:32 Food insecurity and job performance
28:31 Tammy’s role in shaping the field
29:30 Creating change through alignment
31:54 Breaking free from paper templates
33:22 Special issues as signals
34:38 Closing thoughts
35:30 Outro and links
Transcript
00:00 Welcome to the Great IO Get Together. On tonight's show, quips and queries about the world of work as IO Psychology comes alive. Now please welcome our hosts, Richard and Tara. Welcome, everyone, to Great IO Get Together number 40. My name is Richard. This is my co-host Tara. Today we are exploring chapter 14 of our textbook, Research Methods for IO Psychology. And this chapter is all about developing and sharing the results of research with others. 00:25 So to help us share our research meaningfully on the show today, we have Dr. Tammy Allen, Distinguished University Professor of Psychology at the University of South Florida. Welcome to the show. Thank you. I'm so delighted to be with both of you. So I really love the idea of focusing on practical relevance. And it's something that I think a lot of people aspire to, but they don't know whether the field appreciates that kind of research. Do you have any advice for authors about how they can 00:52 conceptualize and execute their research in a way that's more likely to lead to usefulness? I think it all starts with the question: will this question address an issue that leads to a better understanding of phenomena at work? Might it meaningfully change how work is designed, managed, or experienced? And in the inaugural editorial that was published, we tried to parse this out into three 01:20 different aspects of research. One is the ideas: problems that matter for work, workers, organizations. The evidence: do the methods enable that question to be soundly answered? And then, what impact will the research have? What decisions, practices, or policies might be informed or changed by the research? So I think when you're clear about what might actually happen, that 01:50 usefulness really follows more naturally when you attack the research from the start thinking about ideas, evidence, and impact.
I always laugh because I think a lot of people want to, you know, make the world better, but they don't have a lot of life experience. And so the kinds of problems they focus on are pretty narrow; it's the kinds of problems somebody raised a certain way might think are useful. 02:19 But do you think that also applies to our world? Like, do you have to get out into the world to know what kinds of problems might be worth solving? Or is this something that you can figure out by reading the New York Times? What's your sense of the usefulness or importance of those real-world experiences? Yeah, I think it's really important to read widely. And I mean, that may include the 02:46 New York Times. Read widely, talk to a lot of different people. If we're stuck in the lab in front of our computer, we're not able to see the sorts of things going on around us, like you mentioned, Tara. You know, one of the other things that we're doing at the journal is we've developed an advisory board of really outstanding practitioners. This was done as a way to try and generate some of those conversations, 03:13 another way to try and connect the science community with the practice community, so that we can be generating the types of questions and doing the types of research that really matters to people in the trenches, so to speak, today. I really love that you have that advisory board. It's a mirror for whether the things that we think are important really are. I think it's really great. So your own work as a researcher 03:41 has been really influential, both for organizations and for policymakers. Do you have a favorite paper that you think has been influential in a way that you are really proud of or fond of, or maybe a favorite example of how your work has been applied? My answer to that question would probably vary from day to day, given some of my current work on remote and hybrid work issues.
04:09 I published an article in Psychological Science in the Public Interest in 2015. It was a big review of telecommuting, back when remote and hybrid work arrangements were referred to as telecommuting. And it was published with one of my research besties, Kristin Shockley, and with Tim Golden. And that journal is intended to publish work that can inform policy. 04:39 And so the process of doing that review, and reading very widely across a variety of fields so that we weren't just speaking to psychologists, really informed the way I think about doing research that can inform policy. It's been very widely cited, which is gratifying, but isn't what really matters. 05:07 It's the different outlets that are citing that work: public policymakers, researchers in a wide variety of fields, as well as research and consulting firms. I'm happy that many of the conclusions that we came to in that article are making a difference for how companies are thinking about the application of telecommuting and remote work today. That's such a good example. And I think you're right that the ultimate 05:36 mark of impact is that it leaves the bubble of psychology and gets picked up by people in political science or economics as an important idea. And it seems to me like psychologists don't get a lot of training in how to engage with those other disciplines. Like, talking to an economist feels like a really hard challenge because they maybe don't have fluency with some of those ideas. 06:02 Is that just an inevitable part of PhD training, that you have to get really narrow, or are there things we could do better? I mean, based on your experiences, what are things that people could do to better communicate with those audiences? Yeah, that's something we've actually been trying to do with the journal and with our graduate students: give them practice in translating our science for 06:31 people outside of our science.
We're more involved in communicating, via social media, articles that have been published in Personnel Psychology. In my graduate course, one of my assignments for students is to take a body of work or an article and translate it in a way that it can be understood and picked up by 06:59 others outside of IO psychology. And communication as a two-way street is really important here too, right? So we're communicating outwards, but we also need to be good listeners and keep track of what other disciplines are talking about. It might feel like a huge challenge just given how complicated the world is and how many voices there are, but I think that's an important part 07:24 of translation too, right? It's hearing what other people think is important and how they're using these ideas, and how we can potentially build on that and say, well, the psychologists can offer the mechanisms, or we can offer insight into people's perceptions, and fit it into where the conversation already is. When designing a research project, that research project can be impactful, but it's really hard for any 07:54 one study to make a difference for policy. And for policy, you really need to think about the accumulation of evidence. And so that's where each study can be a building block to informing policy. And that's where I think we get a little too caught up in this notion of what's incremental science 08:24 versus something that's novel and changes conversations. We need these building blocks so that we can accumulate the evidence in a way that is programmatic and can inform policy. We don't want to make big decisions typically based on the findings from a single study. We need a body of evidence. That idea is one that gets distorted a lot. What does it mean to be novel? What does it mean to 08:53 make a contribution? People really misunderstand those themes.
And I think on the policy side, we also have to be good coaches and discourage people from getting too excited about one paper, and say, let's think about this paper in the context of everything that we know, and what does a reliable body of evidence look like? I mean, that's not something that most policy aides are really thinking about unless we help them do that. 09:22 So let's talk about the journal some more. Maybe for beginners, brand new beginners, walk us through the process of publishing a P-Psych article from beginning to end. So what happens? You know, I go into the journal system and I upload my paper and I hit submit. What happens to it after that? Who are the people involved? What are their roles? How are they chosen? What are their powers? Walk us through. After you hit that submit button, 09:49 there's an administrative process in the sense that a managing editor will go through the manuscript to make sure that the formatting is the way it should be. There's this pre-processing, if you will, and then it gets sent on to me. And what I do then is review each submitted article for scope and whether it fits the mission of the journal. 10:18 And if it doesn't, it's desk rejected. And we think that's a kind way to deal with that research, rather than having it go through the review process and not get a decision until months later. If it fits the mission and scope, I pass it on to one of our AEs, and I do that 10:48 thoughtfully, in the sense that I identify the AE who seems to have the expertise closest to the topic of the article submitted. And then the AEs also have the power, if you will, to desk reject an article. They may be closer to that topic, and it may feel like it doesn't necessarily fit the mission of the journal. 11:16 Otherwise, they will go ahead and send it out to reviewers. And as a team, we've also had conversations about how important it is to match reviewers carefully as well.
We want to make sure we get the best reviewers, the reviewers that have the greatest potential to love that article that has been submitted. 11:40 And then the reviewers review and submit their reviews, and then the action editor makes a decision. And it's important to note that our action editors are not vote counters. They have the autonomy to make a decision that may be counter to the recommendations of the reviewers. 12:10 But they use the reviewer information to make that decision, which then gets passed along to the authors. And if there's a revision that's requested, we go through the process again. I'm surprised, for one, at the beliefs that people have about how the process works and at how mistaken those beliefs sometimes are. 12:35 You can see that there are variations from journal to journal, and you really do need to discover: is the editor-in-chief handling every paper, or ad hoc reviewers, or members of the editorial board? These are things that I think do differ from place to place, and it's helpful, I think, to have a picture of the whole scene. And we use two reviewers, right? One of those points of variation is the common use of three reviewers versus two reviewers. 13:05 I prefer two reviewers to three reviewers for making decisions. I don't find, generally speaking, that three reviewers helps inform decision making. And as an author, do you want to respond to two reviewers or to three reviewers? We've all had 100 pages of comments to respond to, and nobody wants that. 13:32 Have you run into, you don't have to answer this if you don't want to, but have you run into any problems with reviewers using AI to complete their reviews? And if so, are you willing to talk about it? We don't have ways to detect whether or not that is the case. The policy is such that articles should not be submitted to AI, as that's the intellectual property of the authors. So that is the policy,
14:02 the publisher's policy for the journal. And I think that's the case for most publishers that I'm aware of. Unless there are new ways to detect it that the two of you may be aware of that I'm not, we can be suspicious that it may have occurred, but we don't have a way to detect it. Yeah, it's hard to prove, but do you feel that there are indicators that maybe the 14:32 review was not fully human-created? I wonder what that does to the field and the accountability of the field over time. Certainly computer science is in a crisis because of this issue. And it would be, I think, an even more serious problem if our field ran into the same situation. So it's definitely something of a phenomenon. I have no magic solutions, unfortunately. I mean, I can say as an author, I've received reviews where I thought, oh gosh, 15:02 this was AI generated. And it's hard to pinpoint exactly what it is that gives you that sense. And the issue is not only with reviews, but with papers that are submitted as well. I agree with you. I have a very strong sense of it, but couldn't tell you why in terms of a checklist, which makes it harder to deal with. 15:31 Maybe you can tell us, what are the conversations like at the journal about open science? We are of the mind that we encourage open science practices. It's important for the credibility of our field. It's an important aspect of the stewardship of our science. But we're not mandating specific open science practices. 16:01 And a big reason behind that is that we really want to attract work that involves data from organizations, and that type of data may be proprietary, or there may be aspects of that data that can't be shared. So we really want to encourage all the open science practices where possible, but don't feel comfortable mandating something like 16:30 that along the lines of sharing the data. Now, sharing code, sharing study materials, all of that is strongly, strongly encouraged.
It would certainly discourage people if they were required to share the data. I do wonder about the various differential privacy tools, where you're generating a data set that looks like your data but isn't actually your data, and how organizations would feel about that. 16:59 But I think, I mean, there are often so many layers of caution and lawyering and paranoia in organizations that that would still be really difficult to navigate. Members of the team had some specific examples, and we didn't feel comfortable issuing a mandate that not everyone could adhere to. Okay, let me shift to 17:29 talking about your own journey a little bit. When you think about your own mentors and the people who have shaped the way that you approach research, who are some of those people, and maybe some of the lessons that you've taken from those experiences? I would say it really started with my own work experiences. Before I started graduate school, I worked for a period of time at an insurance company. 17:58 And I had some really great bosses and I had some not so great bosses. And part of that really got me interested in IO psychology as well. And I was really fortunate to have a fantastic IO psychology professor, Ron Riggio, who was at Cal State Fullerton at the time. And I started working in his lab, and 18:25 I was a first generation college student. I had no idea what graduate school was, and Ron opened those doors, so to speak, and I fell in love with research. And it's funny, because I've told him this, and I'm just one student in, you know, thousands and thousands of students, and how you can really make a difference in one person's life when you don't even realize 18:55 it, just by being a caring professor that sees something in someone and helps encourage them. And I think that also inspired some of my research on mentoring as well. And you know, I had lots of great mentors in graduate school. I had a wonderful experience in graduate school.
I met my husband in graduate school. I met my research bestie, best friend for life, Lillian Eby, in graduate school. 19:24 I tell students to really appreciate the people and the relationships that you develop in graduate school as well. So much of the work that we do is through these relationships that we develop, and not only in the sense that they may be your research collaborators, but friends and people that you can 19:53 call upon throughout your career as well. The other thing I will say is that I really believe in coursework. That may not be a popular take, but we took a fair number of courses in the University of Tennessee, rest in peace, I.O. program, and that really provided exposure to all the different faculty. 20:23 So it's not necessary for people to develop research studies with all the different faculty to learn from them. You can learn so much about their different approaches through their teaching in different areas of study. So I embrace that model of having courses and learning from the various faculty through those courses, and developing research ideas through those courses as well. 20:54 Coming from the world's authority on mentoring, I think your perspective here is very valuable. I really think the point about just connecting with as many kinds of people as you can is so important, because you do take something different from each of those relationships, and it shapes the kind of work or the trajectory that you might find yourself on. And I always feel, when I see someone who's 21:22 only worked in one lab or only one context with only one set of co-authors, that they're missing out on something really important that would take their thinking in a different direction. And certainly my own career is shaped by those experiences and the people I met that were working on an interesting problem I didn't know anything about, or who were using a set of methods that I thought were really exciting.
And you do sort of take a little bit from each of those. 21:50 The other point that you made that I really like was about the power that we all have, I think, as people who meet lots and lots of students, to encourage them or potentially discourage them if we're careless, right? And if you're not thinking about those interactions, it can be really impactful for people. It's funny how many people go into IO psychology because they had a bad boss. I hear that a lot. Oh gosh. 22:19 This was a bad boss. But that isn't the way we want people to make decisions necessarily about what kinds of futures would be good for them either. While we're on the topic of advice for students, do you have any other advice for students as they embark on their research journey? So definitely the points you're making about relationship building and thinking about impact are really valuable. Are there other things you see students kind of missing out on, 22:48 or things that are especially important to do early in their careers? Early in your career, and throughout your career, stay curious. Read, which kind of goes back to what we were talking about a little earlier. That is, read what is being published in the journals, but also stay on top of what is happening in the world. And if we want to do research that is impactful, that makes a difference to the world, we have to really 23:18 stay connected and get outside our bubble. I like to read comments published in response to a news article, not just the article itself. I learn so much just by seeing unfiltered comments that are made in response to different topics. And then share your ideas with others, have conversations, not just with your professors, but with fellow students as well. And 23:48 with regular people too, right? The second is load up on methods. I think just about everyone would give that advice. You can't take too many methods and stats classes while in grad school.
It's really, really important to have a robust methods toolkit. And the third thing I would say is data management. You really need to be organized. You need to always be thinking about 24:15 ways to make sure that the methods and the data processing are crystal clear, safely stored, and that others have access. Be thinking about this from a transparency and open science perspective. And my gosh, back up that data in multiple places. We have recently had a case in which some data was lost to the cloud. 24:43 It's really important to develop those skills early on, because as you continue in your career, you have more and more data and more projects, and you need to develop a system to really track all of that in a careful way. It's much easier to start with a good system than to try to look backwards and organize your files and files of mess later. I wish I had built some of that 25:11 infrastructure from the beginning. And Richard and I really put a lot of effort into talking about the sort of logistical things like file naming, because it's the sort of thing that you don't think is important until it is, and then you're really sorry. We all had to learn that lesson the hard way. Yeah, yes, I feel this fully. And then you work with people who have different file naming conventions. 25:39 How many papers can you have that say "paper, draft two"? I've got a million. I'm sure you get this too, but I often get requests from people for data or measures of things from very old papers, and saying "it's on my other computer" is a bit of a mess. So keeping track of those things in a way that persists over time is even... 26:09 is even more challenging. I know that the NSF has been struggling with this a lot. Just how can we create persistent data repositories? You know, OSF is a nonprofit; it could benefit. Your own website is not really very robust or sturdy.
So this is not a solved problem in the field, but anything that we can do as individuals is really going to make it slightly better and easier. 26:37 Yeah, and I think if we all had really robust data management infrastructure, we could create these shared repositories and open up the data in a way in which it could be beneficial to more researchers and based on stronger samples as well. We want to ask you one last question, which is: is there a paper that you've read in the last year 27:05 that really made an impression on you or that you're really excited about? Maybe you can tell us about that paper. So of course, I had to pick a P-Psych paper for this. And this was published by Francisco Marino, who's at MSU, and who's someone who I do not know, so there is no bias here. But the paper is "Working on an Empty Stomach," 27:32 how food insecurity impacts job performance through rumination. And I just love this because it's tackling a really important societal issue. 81 million people globally, 34 million people in the US alone, suffer from food insecurity. It's not something that I had seen tackled in our journals. In conducting the study, they draw on theory, but they also have some really meaningful practical 28:02 implications and roles that organizations can take in addressing the issue. So I thought it was a great example of the type of research I'd like to see more of published in the journal. I'm very ready to see us consider some of these issues. Our research tends to come from a focus on a relatively narrow set of occupations, so it's really great to see the journals embracing that. It gives me a lot of hope. How do you see what you're doing in relation to the broader 28:31 field?
So it's something that I've struggled with a bit as a journal editor, in that you have very hands-on control in terms of shaping what kind of research is being published in your journal, but at the same time, there's this more ambiguous touch that you have on the surrounding research infrastructure. Do you feel like your work with the journal is, like, 29:01 is this a voice in the darkness? Is this something where you feel like a rising tide lifts us all? I don't know. How do you view your relationship with the future of IO psychology? Oh, wow. Yeah, no, I totally understand what you're saying, Richard. And I think it's 29:30 a heavy lift to try and create any degree of change with our journals. So I think in the case of Personnel Psychology, as I mentioned, we're trying to create a shift. One way that I've been trying to make that happen is through alignment, like really bringing the whole editorial team into alignment on our mission, and then 29:58 trying to bring the editorial board into alignment too, because if the reviewers are still using another standard to review papers, we're not going to be able to do what it is that we want to do. So we're getting ready to issue another set of guidelines that are intended for our authors and our reviewers. And I'm very cognizant of the fact that I have three years at the journal. 30:27 That's not very long, and especially when you get to this stage of the career, you realize how quickly three years can go by. So how can you create a shift that begins with the research that's conducted, when conducting the research takes quite a while, and then submitting it and getting it through the whole revision process? You know, I'm optimistic that 30:56 we can execute this shift, but I don't think that I'm naive to the challenge in doing so.
And I don't know what you're seeing at your journal, but I am seeing a lot of research that appears to be kind of based on a template. And I would really like to get rid of the templates as I issue a set of guidelines for reviewers. 31:26 And I think that's part of it: we get stuck in these paradigms, and it's hard to shake that off, and it takes time to get any sort of shift. If someone today gets the message and starts a research project, they might not get it to you in time, and that's tough with only a three-year term. I thought it was five. That's really short. 31:54 Yeah, as far as the template thing, I mean, this is the consequence of people responding to incentives, right? Like, if you write a paper that looks a certain way, historically, everybody thought that was a great paper, whether it was or not. That's incredibly hard to shift unless you can move all the incentives around. I mean, as a journal editor, you do have the power to shift those incentives, but unless everyone else also does, you're kind of in tension with those other forces. 32:24 So that inspirational piece is a really big one. So my term is five years. I have a little more time. But even that seems quite short for trying to deal with some of these longer-term patterns that we see in terms of what people want to, well, what people think journals want. That's that incentive space. I often think about how the journal can and should, and what power I even really have, to influence, like, 32:54 the new generations. Like, does this become an outlet that they would be able to use for tenure? It's kind of a weird question to ask yourself. But that's part of it. It's actually setting up the incentive system so that people will use it in the long term, so that whether we can enact short-term change or not, in the long term it can at least change some minds. Yeah, but it seems like a pretty ambiguous hope sometimes.
33:22 But I also think that, right, the more we send out... we're doing the special issue that Tara is co-editing. And to me, we want that research, but it's also another way to send out the signal of what type of research we think should be conducted and what we would like to see published in the journal. So the more 33:49 we can try and send out these different signals, the greater the likelihood that the larger community will pick up on what we're trying to do and get on board with it. Do you think that special issues have a lot of power in sending signals? Like, the various special issues that implemented results-masked 34:16 submissions, I think, were really powerful in having some journals adopt that as a standard submission type. Certainly getting the idea into people's heads and getting people talking about it has an impact. And if anyone is going to fix things, it's going to be you. So we're putting all our hopes on you, Tammy, to take the field forward. 34:38 Thanks so much for doing it. I know that the two of you have been talking about these issues for a long time too. So it's not just any one voice; it's all of us together that can try and create this change that I think so many people think is needed. And if we all just keep trying to pull the levers, I think we'll get there. I think so too. 35:06 Well, thanks again for your insight and wisdom and for everything you're doing with the journal. We're really just super excited, and we're very honored to have you on The GIG to talk about your vision. Thanks for joining. Thanks for having me. And thanks for doing The GIG. That's another way that we're using our voices. That's it for another GIG. 35:30 To stay in touch, subscribe on YouTube, check out our website at thegig.online, join our LinkedIn group, sign up for our email notification list, and join our Discord. Thanks for joining us and see you next time for another Great IO Get Together.
