Welcome to The Point, PTCMW's blog.

The blog is intended to be a space where we can share our latest research and applied approaches to highlight the passionate individuals and teams in our field.

As members, please feel free to post your comments and thoughts in relation to each article.

  • 06/19/2024 9:40 AM | Anonymous

    Author: Jackie Martin Kowal, PDRI by Pearson

    In today's dynamic professional landscape, keeping up (much less staying ahead) often means mastering new tools and technologies. I/O psychologists on the job market know the impact that listing expertise with new tools and technologies can have on a resume. Research from the World Economic Forum suggests that by 2025, over half of all employees will require significant reskilling or upskilling due to technological advancements and industry changes. It is imperative that I/Os take the initiative to learn novel tools and technologies to stay competitive in today’s workforce.

    However, there are several barriers to learning novel tools and technologies. In particular, if there is no clear structure or support for learning them through formal education or on the job, mastering a new tool or technology requires learning independently. That means dedicating time (and sometimes money) outside of other commitments. Furthermore, for popular tools and technologies, the sheer amount of information available can be daunting, especially in fields outside of traditional I/O topics.

    This article will explore the art of self-learning by providing a roadmap for mastering new tools and technology. I will provide personal experience and examples related to mastering Tableau, a tool for which I am self-taught. The strategies outlined here can be applied to learning any new tool or technology (e.g., RMarkdown, ChatGPT) or, more broadly, any new skill you are interested in (e.g., machine learning techniques).

    Find Free Resources

    For most tools, it will be easy to find free resources and the tougher part will be deciding how to narrow down the available resources into something manageable. My recommendation in this case is to focus on resources that are applied specifically to I/O psychology or Human Resources/Talent Management, since these will be the most relevant to your work. For Tableau, I googled “Tableau I-O Psychology”, which led me to Richard Landers’ training available on SIOP’s website. 

    Additionally (or if I-O or HR specific training does not yet exist), I recommend checking out resources available from the tool itself. Tableau has many free training videos available. You may also consider self-paced online learning tools like Coursera, DataCamp, or LinkedIn Learning. Sometimes these tools offer discounts or free trials that you can leverage.

    It can be challenging to find the most relevant and high quality training materials. To address this, I adopted a trial-and-error approach, dedicating initial time to sample a variety of resources before committing to one. This allowed me to gauge the relevance, comprehensiveness, and effectiveness of each resource in addressing my learning objectives. I paid close attention to the clarity of instruction, relevance of examples to real-world scenarios, and the depth of coverage on essential topics before choosing the best training resources to pursue.

    Identify Resources Available Through Your Employer

    Check with your workplace about available resources for learning the tool you’re interested in. Does your employer already have licenses for the software? Is training available through work? Can your supervisor provide dedicated professional development hours so you can learn on work time?

    When I was learning Tableau, my workplace was able to provide me with a Tableau license and an allotment of professional development hours. Additionally, our Tableau license included training courses that I could take. It is important to advocate for yourself in this space. Most companies recognize the importance of providing professional development opportunities for both job performance and employee retention. You may also consider enlisting the support of coworkers who are interested in picking up the same skills. A stronger case for resources can be made to the company when a larger group of employees is interested in learning that skill.

    Make A Plan

    Once you've identified the resources you want to use and you’re ready to embark on your self-learning journey, tracking your progress and setting clear goals becomes crucial for maintaining motivation. Here's how you can effectively monitor your advancement and establish achievable goals:

    • Set SMART goals: Ensure that your goals are Specific, Measurable, Achievable, Relevant, and Time-bound (SMART). For instance, instead of setting a vague goal like "become proficient in Tableau," aim for a SMART goal such as "create three interactive dashboards in Tableau within the next three months."
    • Use progress trackers: Implement tools or methods to monitor your progress systematically. This could involve using spreadsheets, task management apps (e.g., Microsoft To Do, Todoist), or specialized learning platforms (e.g., Coursera) that allow you to track completed modules, projects, or skill assessments.
    • Keep running notes: Maintain running notes to document your experiences, insights, and challenges encountered during your self-learning journey. Reflecting on your progress regularly can help you identify areas for improvement. These notes will also serve as a useful reference for keeping track of resources and reminding yourself of key skills later on.
    • Celebrate milestones: Acknowledge and celebrate each milestone you achieve along the way. Whether it's completing a challenging project, earning a certification, or mastering a new technique, taking time to recognize your accomplishments can fuel your motivation to continue learning.
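    If you like tracking things in code rather than a spreadsheet, the SMART-goal and progress-tracker ideas above can be sketched in a few lines of Python. This is just an illustrative example; the goal, milestones, and dates below are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Milestone:
    description: str
    due: date
    done: bool = False

@dataclass
class LearningPlan:
    goal: str  # a SMART goal: specific, measurable, achievable, relevant, time-bound
    milestones: list = field(default_factory=list)

    def complete(self, description: str) -> None:
        # Mark the named milestone as finished
        for m in self.milestones:
            if m.description == description:
                m.done = True

    def progress(self) -> float:
        # Fraction of milestones completed so far
        if not self.milestones:
            return 0.0
        return sum(m.done for m in self.milestones) / len(self.milestones)

plan = LearningPlan(
    goal="Create three interactive dashboards in Tableau within three months",
    milestones=[
        Milestone("Finish intro training videos", date(2024, 7, 15)),
        Milestone("Build first dashboard from a Kaggle dataset", date(2024, 8, 15)),
        Milestone("Build a dashboard for a work project", date(2024, 9, 15)),
    ],
)
plan.complete("Finish intro training videos")
print(f"{plan.progress():.0%} complete")  # → 33% complete
```

    Reviewing a tracker like this weekly pairs naturally with the running notes and milestone celebrations described above.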

    Apply Your New Skills

    Hands-on practice and experimentation are essential to mastering a new tool. Find a dataset or project that interests you and gives you the opportunity to practice your new skills. Below are a few recommendations for finding such a project, listed in a progression from low-stakes to high-stakes applications. I recommend working your way up from freely available datasets to a work-related project.

    • Leverage freely available datasets: There are many sites where you can find freely available datasets. Tableau offers sample datasets on its website, and Kaggle is also a great resource for datasets across a broad range of applications and fields.
    • Volunteer your new skills: Ask a friend or family member if you can apply your time and new skills to a project they have, or look for another way to put your new skills to use. When I was learning Tableau, I volunteered to help members of SIOP create a membership dashboard (SIOP login required) for their website. This was great practice using a large dataset that needed a lot of reformatting and cleaning prior to visualization in Tableau. It also provided the opportunity to practice implementing specific user requests.
    • Find a work-related project: Think through your current work projects and identify one that you could use your new skills on. Make a case to the project manager for that project about how this work could benefit the project. Be sure to consider the additional time required to learn and apply the new skill when building the project plan and timeline. When I was learning Tableau, I was working on a project where we were conducting a comprehensive literature review of military research to identify the top predictors of success in leadership roles. I thought using Tableau to create a visualization to summarize the research would be helpful since we were looking at hundreds of validity coefficients across dozens of constructs. Tableau allowed us to visualize this information in one place and to filter the constructs and articles on demand.
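    As an aside on the reformatting step mentioned above: Tableau generally works best with long ("tidy") data, so preparing a dataset often means unpivoting a wide table before connecting to it. Here is a minimal sketch using pandas; the articles, constructs, and validity coefficients are made up for illustration and are not from the projects described above:

```python
import pandas as pd

# Hypothetical wide-format summary: one row per article, one column per construct
wide = pd.DataFrame({
    "article": ["Smith (2010)", "Lee (2015)"],
    "conscientiousness": [0.22, 0.18],
    "cognitive_ability": [0.51, 0.47],
})

# Unpivot to the long ("tidy") layout Tableau prefers:
# one row per (article, construct) pair with its validity coefficient
long = wide.melt(
    id_vars="article",
    var_name="construct",
    value_name="validity",
)
print(long)
```

    From here, the long table can be written out (e.g., `long.to_csv("validity_long.csv", index=False)`) and connected to in Tableau, with `construct` available as an on-demand filter.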

    Mastering new tools and technologies independently is not just a professional advantage; it's a necessity in today's rapidly evolving job market. Remember, learning is not a one-time event but a lifelong process, and embracing this mindset can lead to opportunities for growth and development. Though learning new tools and technologies can be daunting, I hope the strategies listed in this blog will help you get started!

    Have you learned a new tool recently? Please share your experience and any resources by leaving a comment below!

    Author Bio: Jackie Martin Kowal is a Senior Consultant and Team Lead at PDRI by Pearson with experience in talent management and consulting spanning the private sector, government agencies, and military organizations. Her experience includes work in the areas of job analysis, competency modeling, selection assessment development and validation, performance measurement, program evaluation, training development and evaluation, training needs assessment, and engagement survey development and analysis. Jackie received her Ph.D. in Industrial-Organizational Psychology from the University of South Florida.

  • 03/31/2024 10:27 AM | Anonymous

    Author: David Swiderski, PTCMW Blog Editor

    Today's post is a brief introduction to this year's PTCMW board members.  These individuals bring a wealth of expertise and enthusiasm to further PTCMW's mission.  Each board member was asked to say a bit about themselves and why they chose to serve on PTCMW's board.  Please extend a warm welcome to each of them; I look forward to their contributions to shaping the future of I/O psychology in the DC metro area and beyond!

    President - Jennifer Flaig, Emergency Services Consulting International (ESCI)

    Hello, I'm Jennifer Flaig, and I am honored to serve as the President of PTCMW for 2024. I currently serve as the Director of the Human Capital Division for Emergency Services Consulting International (ESCI). In my role, I develop and direct testing and training programs for public safety departments throughout the United States. My journey into this leadership role for PTCMW was driven by a passion for I/O psychology and a commitment to creating a space where I/O professionals can come together, share knowledge, and collectively address the challenges we face, because, let’s be honest, we have challenging jobs! Having spent over 20 years in the field of assessment and selection, I have witnessed the incredible potential for growth and collaboration within our field. My vision for this year is to continue to build connections, foster mentorship opportunities, and create an environment where every member feels supported in their professional journey. Thank you for entrusting me with this opportunity to lead, and I'm excited to work with our incredible Board to bring meaningful experiences and networking opportunities to the PTCMW community!

    Past President - Phil Walmsley,  U.S. Office of Personnel Management

    I am a Lead Personnel Research Psychologist in the U.S. Office of Personnel Management’s (OPM) Assessment and Evaluation section. We’re a team of federal employees assembled to assist with measurement initiatives focused on the employees, applicants, and customers of agencies across the U.S. Government. I conduct job analyses, develop assessment/measurement strategies, validate the use of a variety of pre-employment and leadership assessments, deliver trainings, and conduct large-scale data analyses. I serve as an advisor on the design of software as a service technology systems used governmentwide for talent management and assessment. I’ve also participated in personnel selection working groups held by several agencies. I’m a member of the editorial board of the International Journal of Selection and Assessment, received the Society for Industrial-Organizational Psychology’s Distinguished Early Career Contributions Award, and served as the 2023 President of the Personnel Testing Council of Metropolitan Washington. I received a M.S. in Industrial-Organizational Psychology from Missouri State University and Ph.D. in Industrial-Organizational Psychology from the University of Minnesota.

    I recall attending my first PTCMW events in 2007, and in particular that year’s special event featuring Bob Guion. Ever since, I have found the twin aspects of professional education and networking to be of incredible value, and all within a setting of our community’s welcoming and collegial atmosphere. I joined the PTCMW Board in hopes of continuing that collaborative and expertise-enhancing spirit. It has been very rewarding to work with everyone involved in recent years’ Boards, and I look forward to seeing you at a future PTCMW event!

    President Elect - Michael Heil, JPMorgan Chase

    After several years of attending powerful events as a member, I wanted to serve on the board so I could more fully help support PTCMW and our professional community. I’m very excited that I now have the opportunity to do so as President-Elect.

    I have close to 30 years of experience developing and validating skill-based assessments as both an internal and external consultant in the public and private sectors. I’m currently employed by JPMorgan Chase, where I’m responsible for supporting and leading assessment projects related to the design, development, implementation, and ongoing monitoring of talent assessments for employee selection and development. I’m also an Adjunct Professor at American University, where I’ve taught I/O psychology for over 22 years, and I continue to serve as a member of the University of Maryland Baltimore County (UMBC) I/O Psychology Advisory Board.

    I graduated from Kansas State University in 1997 with my Ph.D. in I/O psychology.

    Vice President - Christopher Whiting, University of Maryland

    Hello, my name is Christopher Whiting, and I am a Management Consultant for one of the largest technology consulting organizations, where I develop people and lead the organization through change. I have a passion for people and their professional and personal development. I chose to serve on the PTCMW board because of the organization’s commitment to the continuing development of I/O professionals and students. I have benefited from the programming and want to give back by furthering the mission of PTCMW. Outside of my professional career, I give my time to efforts that uplift and improve the lives of those in disadvantaged communities. I enjoy creating new experiences through travel, museum visits, and digital art. I am a graduate of the University of Houston and on track to receive my M.P.S. in I/O Psychology from the University of Maryland in May 2024. I am looking forward to a wonderful year with PTCMW.

    Secretary - Tori Bakeman, HumRRO

    I was born and raised in Oklahoma and attended the University of Oklahoma for undergrad where I was a research assistant in an I/O Psychology lab. Upon graduating, I worked in HR for about two years and interned in Professional Learning and Leadership Development while obtaining my Master’s in I/O Psychology from Baruch College. While I initially thought that I was more interested in pursuing the “O” side of the field, it was through my coursework that I realized my love for selection/assessment and all things data. I moved from NYC to the DMV about one week after graduating to join the Human Resources Research Organization (HumRRO) where I am currently a Research Scientist. At HumRRO, my work primarily involves assessment development for government and private sector clients. My day-to-day activities include project management, analytics, and facilitation of SME workshops. I have been a member of PTCMW since relocating to the DMV and have found that the organization’s events are fun opportunities to connect with like-minded individuals. I wanted to join the board to be more involved in the organization and the broader I/O community. I am excited to continue meeting and learning from others in our field and look forward to serving as Secretary!

    Treasurer - Peter Butucel, U.S. Navy

    Hello, PTCMW members! I am Peter Butucel, and I currently serve as your Treasurer. I joined PTCMW in 2021 to expand my professional network and to grow my expertise in the I/O field. Soon after joining the organization, I answered a call for volunteers and have since served in several roles: supporting the Vice President for Programs with the Fall Event, as Secretary in 2022, and now as Treasurer for 2023-2024.

    In my professional role, I am a senior advisor to the Chief of Navy Chaplains at the Pentagon. My responsibilities include advising Navy admirals on workforce staffing, professional development, and policy. To that end, I assist with the development and execution of a corporate strategy to develop competent leaders of character with the personal connections to lead effectively and the knowledge to create, build, and sustain the spiritual readiness of warfighters in both the Navy and Marine Corps.

    I hold a Master of Science degree in Industrial-Organizational Psychology from Liberty University. In 2021, I was selected by the Navy to participate in the Secretary of the Navy Tours with Industry program as a Military Fellow. During my tenure, I worked in the Learning & Development department at Hilton corporate in McLean, VA, where I was part of a small team that developed the company's strategic plan for lifelong learning. As a result, Hilton became the first in the industry to offer robust educational opportunities at no cost to their team members.

    When I am not working, I love to travel, explore new places around the DMV with my family, and SCUBA dive.

    Recorder - Kenzie Hurley, PDRI by Pearson

    Hi! My name is Kenzie Hurley and I'm the 2024 Recorder for PTCMW. I chose to join the board because, as a recent graduate, I found that this would be a fantastic opportunity to network with people in my field and make connections in the DMV area. Outside of PTCMW, I work full-time as a Consultant at Personnel Decisions Research Institutes (PDRI). My work largely involves assessment-related research for a variety of clients in both private and public sectors. I received my Doctorate in Industrial-Organizational (I/O) psychology from Clemson University in 2023 and my Master's in I/O psychology from the University of West Florida in 2021.

  • 11/18/2023 5:34 PM | Anonymous

    Author: David Swiderski, PTCMW Blog Editor

    The 2023 PTCMW Fall Event was headlined by Ann Marie Ryan, a leading scholar in the field of personnel selection.  This blog post provides some quick takeaways to spread Dr. Ryan’s message on the influence of technology on accessibility and hiring.  Her talk focused on three dichotomies to consider in how technology has changed the hiring landscape for applicants and hiring managers, with implications for researchers who study these concepts and for practitioners in this area.


    Takeaway 1: Technology can expand and decrease access for individuals

    On the positive side, this takeaway focused on technology’s ability to bring assessments to individuals who otherwise would never have had the opportunity to apply and be evaluated for a job they are interested in.  The digital divide is a well-known phenomenon describing differences in access to technology and the internet across groups.  For example, a survey by the Pew Research Center found that 80% of White adults reported owning a desktop or laptop computer, while only 69% of Black adults and 67% of Hispanic adults reported owning either of those technologies.  With the rise of mobile assessments and unproctored environments, assessments can be taken by a wider group of individuals than ever before.

    Dr. Ryan cautioned that, as we continue to adopt new technologies, we should be aware of the “hidden barriers” we may be creating for some groups.  As an example of a hidden barrier, she cited companies that use artificial intelligence to transcribe what individuals say in an asynchronous interview and then score them on competencies via an algorithm.  Although this may be beneficial for some groups, one group potentially harmed by this practice is people who stutter.  Dr. Ryan presented some of her work with HeardAI, which aims to improve voice recognition technology for people who stutter.  Improving this technology will benefit not only the estimated 80 million individuals who stutter, but also the 180 million individuals with speech differences, and any individual who experiences speech disfluency, which happens to everyone.  This is an example of the curb cut effect, where improving the environment for those with a disability also improves functionality for all people.  One key point Dr. Ryan expressed was that, when training algorithms, we need datasets that represent all relevant users to the extent possible.

    Takeaway 2: Technology can expand what we assess, but can also introduce opportunities for bias

    The second dichotomy the talk focused on was that technology has provided assessment professionals with new ways to assess constructs relevant to many different jobs.  Cutting-edge research on the role of artificial intelligence in selection has shown that it can assist in item development, provide better strategies for combining predictors, and reduce subgroup differences.  By providing novel methods for measuring constructs, these strategies may capture additional construct-relevant variance.

    On the other side, Dr. Ryan discussed the potential for these types of assessments to unintentionally introduce opportunities for bias into the assessment process.  For example, identity contingency cues may signal to applicants how welcome they are within an organization by demonstrating how different aspects of their identity are represented and valued by the organization (or not).  More complex assessment environments offered by technology may make organizations more or less attractive to applicants depending on their reactions to the assessment and whether they see themselves represented in those environments.  Having more cues available for applicants to interpret may not always lead to positive outcomes.  In general, applicants appear more attuned to negative experiences than positive ones, so even subtle negative cues may ruin an otherwise positive applicant experience.  This may lead to negative outcomes such as a damaged reputation for the organization or a reduced likelihood that the applicant will accept a job offer.

    Takeaway 3: Technology needs a human in the loop which can amplify, reduce, or ignore human biases

    When the presenter discussed this dichotomy, two concepts stood out when thinking through how automated decision making can be combined with human judgment.  The first concept was algorithm aversion, the tendency for people to dislike having algorithms make decisions for them.  Dr. Ryan explained that people are generally accepting when algorithms make routine decisions with minimal consequences.  However, when major decisions are involved, such as the decision to hire someone into an organization, people are less likely to rely on information provided by an algorithm.  The second concept was the autonomy-validity dilemma, the tradeoff between humans’ desire for autonomy when making decisions and the superior predictive accuracy of mechanically combining information.  As brought up during the question-and-answer portion of the talk, this mirrors findings in the employee selection literature that hiring managers have a strong desire to make autonomous decisions even when mechanical combinations of data may provide better prediction across a range of outcomes.  Dr. Ryan also referenced an NSF grant she is contributing to called the Trapeze Project.  The project is still in its early stages but aims to study many of these issues in detail, so look out for future research in this area.

    Thanks to all of you who have taken the time to read this blog and a special shoutout to those who attended the talk and contributed their thoughts. Finally, I’d like to personally thank Dr. Ryan for an insightful and engaging presentation!  For others who attended the event, feel free to share your thoughts and takeaways in the comments below.  I hope to see you at next year’s PTCMW Fall Event!

  • 08/26/2023 8:13 AM | Anonymous

    Author: Kira Foley, Perceptyx

    Industrial and organizational psychologists are experts in measuring, diagnosing, and explaining human problems in the workplace. As a collective, we’ve helped build thousands (maybe millions?) of selection tests, performance assessments, and employee listening surveys. Our work has undoubtedly improved not only business performance but also the well-being of workers across the globe, all by bringing sound psychological theories and principles, psychometrics, and statistics to organizational decision-making. However, the true potential of the people analytics and Human Resources (HR) technology industry is stifled by one critical weakness: our products explain but don’t necessarily change behavior. In this blog post, I will explain why I am convinced that the behavioral economist’s idea of “nudges” offers a promising, scalable solution to our collective problem. 

    What are nudges?

    Nudges are behavioral interventions that require minimal action but help individuals act in their own best interest. Nudge theory assumes that we make many small decisions throughout our day and that the environment around us influences each of those decisions, for better or for worse. Nudges are environmental influences that make better decisions easier and more likely, whereas “sludge” is something in the decision-making environment that creates friction and makes good decisions more difficult. Because nudge theory is so focused on influencing decisions, much of the work in this area seeks to prevent common cognitive biases and heuristics in decision-making like “anchoring bias” or “the halo effect.” One important tenet of nudge theory is that nudges should be optional. For example, “putting fruit at eye level counts as a nudge; banning junk food does not” (Thaler & Sunstein, 2008). 

    There are many examples of successful nudging in practice. One notable example is the Amsterdam Schiphol Airport’s effort to reduce restroom cleaning costs back in the 1990s. They added pictures of flies to urinals and effectively nudged patrons to reduce spillage by 80%. In another example, Volkswagen used nudge theory to encourage more people to take the stairs rather than the escalator in a Stockholm metro station. They installed a working piano staircase that played a note with every step, nudging 66% more commuters than usual to opt for the stairs.

    Around ten years ago, nudges started making their way into the workplace. Nudge theory has been adopted by HR and business leaders to encourage a variety of employee behaviors. However, it’s important to note that not all nudges are created equal, and some nudge products on the market are not nudges in the scientific sense. For example, arranging desks to encourage socialization among coworkers is a nudge, but sending weekly email reminders to schedule more “heads down” focus time is a notification, not a nudge. The former intervention changes the environment to make a behavior (socializing) easier, whereas the latter serves as an informative message or a request. What makes an effective workplace nudge can depend on the medium used to communicate the nudge to employees, the timing of the nudge, and the extent to which the nudge is deployed ethically.

    A 3-step recipe to creating your own nudge

    Now that you know a bit about what a nudge is, I’ve provided a simple framework below for developing an effective nudge with an example that is carried throughout the process to demonstrate how to apply the technique.

    1. Identify the decision you want to influence. 
    Example: I’m a project manager who’s about to start a new project and I want to get my team’s input on the project scope and goals. I plan to hold a live brainstorming session with my team, but I’m worried that the louder voices will dominate the discussion. To encourage dissenting opinions, I want to nudge team members who disagree to speak up.
    2. Try to really understand the decision-making environment. Ask yourself:
    a. What in the decision-making environment is fueling the desired behavior? Fuel is anything that makes the target behavior more appealing or likely.
    b. Is there friction in the decision-making environment preventing the desired behavior? Friction is anything that gets in the way of people performing the target behavior. 
    Example: The decision I want to nudge will happen in a team meeting via a video conference call. One thing I know fuels people speaking up is silence: most people on my team are uncomfortable with silence, and someone always speaks up to fill it. One thing likely to create friction that prevents those with different opinions from speaking up is when many like-minded members publicly agree with each other.
    3. Pick one small behavior you can perform to change the environment in a way that encourages (but doesn’t force) the decision you want others to make. Your behavior is the nudge, so ask yourself: what does an effective nudge look like, quantitatively and qualitatively?
    Example: Given how many unknown variables there are in the project team meeting itself, I will try to nudge my team before the meeting. To nudge members towards sharing their diverse opinions during the meeting, I will email the team the meeting agenda ahead of time so that everyone can choose to develop their own opinion before being influenced by the group discussion. My email with the meeting agenda acts as the nudge. A successful nudge will lead to at least one person voicing a contrasting opinion during the meeting and everyone feeling like it’s safe to speak up even when it seems like some of the team will disagree. 

    How could nudges solve our behavior change problem?

    Nudges, if both science-backed and applied ethically, can help us close the gap between merely understanding workplace challenges and actually taking action to address them. Nudges do something different from the behavior change interventions we’re used to. Unlike more targeted (and expensive) interventions like training or coaching, nudges are scalable. They don’t take much of the employee’s time, and they can be used consistently over time to inspire lasting behavior change. In this way, nudges can help us change, not just explain, workplace behavior at scale.

    However, the truth is that your average I/O knows very little about nudge theory and probably even less about the science of behavior change (at least as it’s been practiced by behavioral economists and social psychologists). To be successful, we need to collaborate with others in the behavioral and social sciences, in both research and practice. We as I/Os deeply understand the human experience at work. Nudge experts like behavioral economists deeply understand human decision-making. Together, we could tackle the most pressing issues like manager burnout, bad bosses, and layoff anxiety.

    Author Bio: Kira Foley is a behavioral and social scientist passionate about the power of applied research to solve real-world problems. She earned her PhD in Industrial-Organizational Psychology from The George Washington University, where she published research on a variety of workplace challenges that affect leadership and teams in a variety of organizational contexts (U.S. Army Soldiers, women leaders, desk-less workforce). In her current role as a Behavioral Scientist at Perceptyx, she sits on the product team and plays a first-hand role in baking behavioral science into HR technology Software as a Service (SaaS) products.

  • 07/11/2023 5:23 PM | Anonymous

    Author: Amanda Allen, IPAC Conference Program Chair

    The 2023 IPAC Annual Conference is right around the corner and is being held in Washington, D.C., July 23-26 at the Capital Hilton.  For those readers not as familiar with IPAC, it is the International Personnel Assessment Council (formerly part of IPMAAC) and our members include HR directors and managers, specialists in staffing, recruiting, and organizational performance management, psychologists, attorneys, management consultants, academic faculty and students, and others.  Our members represent the private sector, public sector (including local and federal governments), and nonprofits.  We all share professional interests and expertise in the development and effective use of HR selection and assessment methods.

    Program Highlights

    At this year’s conference, we are lucky to be located in the heart of D.C., near the White House and tons of restaurants, bars, and shopping.  This is the second in-person conference IPAC has held since the start of the pandemic, and we have a great program planned.  This year’s submitted content ranges in topics from job analysis and test development and scoring to leadership assessment and workforce and succession planning.  Prior to the start of the conference, we have two pre-conference workshops that provide more hands-on learning opportunities on two important topics: 1) conducting barrier analysis and 2) developing Behaviorally Anchored Rating Scales (BARS). 

    New this year, we have a special topic track that includes invited speakers who will present on different aspects of Artificial Intelligence (AI) in assessments.  Because this has become such a hot button topic with recent guidance from the Equal Employment Opportunity Commission (EEOC) as well as interest from Congress and the White House, we wanted to provide our attendees with up-to-date information as this area evolves.  We have five presentations from experts in academia, government, and consulting that cover the following topics:

    • AI-based assessment regulations/enforcement
    • Recommendations and actionable strategies for employers and AI-vendors from the AI Technical Advisory Committee (TAC)
    • Practical application of AI to augment existing job analysis procedures
    • In-depth discussion of the good (benefits), bad (potential risks and pitfalls), and ugly (challenges) aspects of generative AI for selection and assessment
    • A review of current research on AI

    In addition to the special invited speaker track, we have some well-known and exciting keynote speakers who will be talking about various assessment topics.  You’ll hear from:

    • Elaine Pulakos, Ph.D., CEO of PDRI, on leadership and organizational resilience;
    • Juliet Aiken, Ph.D., consultant with Volta Talent Strategies and Head of Consulting at Conducere, on leveraging assessment technology in an equitable way;
    • Victoria Mattingly, Ph.D., CEO and founder of Mattingly Solutions, on effective measurement in Diversity, Equity, and Inclusion (DEI) efforts;
    • Eric Sydell, Ph.D., entrepreneur and consultant, on the implications of AI for talent acquisition and the broader tech landscape; and
    • Elizabeth Kolmstetter, Ph.D., Chief People Officer at the Cybersecurity and Infrastructure Security Agency (CISA), on assessing capabilities and culture in organizations. 

    You won’t want to miss these presentations!  See the conference website for more information.

    Social Events

    As if that is not enough to convince you to attend the 2023 IPAC Annual Conference, we also have a lot of social and networking opportunities planned throughout the three days.  For those of you who may not have attended an IPAC Annual Conference previously, it tends to be more of an intimate setting with lots of opportunities to connect with other attendees.  To facilitate this, we host a hospitality suite each night of the conference where attendees can mingle and connect with each other in a relaxed setting.  In addition, we offer a welcome reception to kick off the conference on Sunday night and a social event (with drinks and snacks) on Monday night.  First-time attendees can also participate in a coordinated dinner on Tuesday night designed to facilitate connections at the conference.  Check the conference website for more details on these events.

    Lastly, we couldn’t put on such an event without our generous sponsors!  You’ll see many of them throughout the conference, but please check them out on our website as well to find out more information now.

    Discounted Conference Registration

    IPAC has decided to extend the registration discount for the 2023 IPAC Annual Conference to all PTCMW members because we are sure some members did not attend the Happy Hour event on June 8th due to the extreme smog that day.  The code below can be entered when registering and will work until the end of the conference. 

    PTCMW Members – $350.00

    Code = ptcmwhappyhour2023!

    We hope to see you at the conference this year!  Come join us in Washington, D.C.!

    Author Bio: Amanda Allen, Ph.D., is an Industrial and Organizational (I/O) Psychologist and Senior Consultant at DCI Consulting Group. As a member of the Employment & Litigation Services Division, Amanda provides consultation to clients primarily in the areas of employee selection and assessment.

  • 06/23/2023 2:01 PM | Anonymous

    Author: Matt Dickson, Capital One

    There’s little dispute that data drives the world. Sessions of Congress, major court cases, and some of the largest companies in the world exist because of the power of data. Regardless of the opinions we may have around data privacy, ownership, and the ethical use of data, it’s hard to deny that data is the backbone of I/O psychology. By combining data analytic skills with the inherent background our field has in organizational science, I/O psychologists are uniquely positioned to make meaningful differences within organizations, and drive innovation within this ever-evolving field.

    Data is omnipresent in the world of an I/O. It’s data that helps prospective job-seekers find the roles best matched to their own skills and abilities; it’s also data that enables organizations to efficiently navigate the costly world of hiring the best possible personnel they can. Data is the ally that helps raise issues of disparate impact or discrimination in the workplace to the forefront, be it with regards to pay, performance, or any other measurable outcome. Data is everywhere in the world of an I/O, but it isn’t everything. It must be coupled with both conceptual and technical knowledge of how to understand, manipulate, and analyze it; otherwise, you’re left with just numbers on a screen.

    Generally, I/O psychologists tend to be pretty good with numbers. You’ve probably opened Excel more than a few times in your life, hand-calculated some inferential statistics a time or two (or perhaps that’s a distant stats class nightmare), and whether it’s your best friend (or frenemy), you at least know what SPSS is. That said, working with data in practice as an I/O isn’t always as simple as it was in graduate school. Later in your career, you might find yourself saying things like:

    • “If I try working with a file this big in Excel, it crashes!”
    • “What does this part of the output mean again?”
    • “I got laughed out of the room when my IT department saw what SPSS costs!”

    Even getting access to - let alone working with - data as an I/O in the real world comes with a bevy of challenges. In fact, I would argue that I/Os are not consistently equipped to handle data as part of their formal training. Fortunately, there is a wealth of knowledge, skills, and abilities that we can leverage from the world of data analytics.

    Data Manipulation and Analysis: Data analysts excel at gathering, organizing, and analyzing large datasets. They have expertise in statistical methods, data querying and cleaning, and languages such as SQL, Python, or R. These skills enable them to extract meaningful insights from complex data, identify relationships, and generate predictive models. I/O psychologists, by training, are well-versed in statistical methods and are familiar with working with data. However, there is often a gap when it comes to manipulating data and combining data sources to make them usable (e.g., joins, interacting with data warehouses, knowledge of data types).
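
    To make the idea of combining data sources concrete, here is a minimal sketch of a join using Python’s built-in sqlite3 module. The tables, names, and ratings are entirely made up for illustration:

```python
import sqlite3

# In-memory database with two hypothetical HR tables (all data is illustrative).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE employees (emp_id INTEGER PRIMARY KEY, name TEXT, dept TEXT);
    CREATE TABLE ratings   (emp_id INTEGER, year INTEGER, rating REAL);
    INSERT INTO employees VALUES (1, 'Avery', 'Sales'), (2, 'Blake', 'R&D');
    INSERT INTO ratings   VALUES (1, 2023, 4.0), (1, 2022, 3.0), (2, 2023, 4.0);
""")

# A LEFT JOIN combines the two sources on the shared key (emp_id), keeping
# every employee even if they have no ratings, then averages per person.
rows = conn.execute("""
    SELECT e.name, e.dept, AVG(r.rating) AS avg_rating
    FROM employees e
    LEFT JOIN ratings r ON r.emp_id = e.emp_id
    GROUP BY e.emp_id
    ORDER BY e.name
""").fetchall()
print(rows)  # [('Avery', 'Sales', 3.5), ('Blake', 'R&D', 4.0)]
```

    Choosing a LEFT JOIN here (rather than an inner join) keeps employees with no ratings in the result, which is exactly the kind of decision that trips people up when first combining organizational data sources.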

    Data Visualization: Communicating data-driven findings effectively is essential for driving change within organizations. Data analysts are skilled at creating visually appealing and informative charts, graphs, and dashboards that convey complex information in a concise and understandable manner. This skill is invaluable for I/O psychologists in presenting findings to stakeholders and facilitating data-driven decision-making. While I/O psychologists are able to create data visuals that communicate findings, the challenge is often crafting a clear and powerful (but still accurate) message for colleagues who may not be as familiar with data analysis or statistics.
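
    Dedicated tools like Tableau or matplotlib are the right choice for stakeholder-facing visuals, but even a quick, dependency-free sketch can help you sanity-check a distribution before polishing it. The rating categories and counts below are hypothetical:

```python
# A quick text "bar chart": a lightweight stand-in for charting tools when
# you just need to eyeball a distribution. Labels and counts are made up.
ratings = {"Exceeds": 12, "Meets": 47, "Below": 6}

def text_bar_chart(counts, width=40):
    """Render counts as rows of '#' characters, scaled to the largest count."""
    peak = max(counts.values())
    lines = []
    for label, n in counts.items():
        bar = "#" * round(width * n / peak)
        lines.append(f"{label:>8} | {bar} {n}")
    return "\n".join(lines)

print(text_bar_chart(ratings))
```

    Crude as it is, one proportional bar per category already tells the “most people meet expectations” story at a glance.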

    Machine Learning and Predictive Modeling: Data analysts are knowledgeable about machine learning techniques, allowing them to build predictive models based on historical data. These models can help I/O psychologists anticipate future trends, identify potential issues, and make data-driven recommendations to improve organizational outcomes. Most I/O psychologists are familiar with the concepts of logistic and linear regression, and these are often keystone algorithms in the work we do in both applied and academic settings. However, most of us get little exposure to unsupervised machine learning algorithms, which focus on identifying hidden patterns in data. Factor analysis is a great example that I/O psychologists are likely familiar with, but increasing the exposure I/O psychologists have to other similar methodologies will only add to what the field is capable of.
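
    Factor analysis requires heavier machinery, but the flavor of unsupervised learning is easy to sketch with a tiny one-dimensional k-means in plain Python. The function, data, and “engagement scores” framing are all illustrative, not a production implementation:

```python
import random

def kmeans_1d(values, k=2, iters=20, seed=0):
    """Minimal 1-D k-means: find k cluster centers in a list of numbers."""
    rng = random.Random(seed)
    centers = rng.sample(values, k)
    for _ in range(iters):
        # Step 1: assign each value to its nearest center.
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # Step 2: move each center to the mean of its assigned values.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Hypothetical engagement scores with two latent groups of respondents.
scores = [2.1, 2.4, 1.9, 2.0, 8.8, 9.1, 9.4, 8.7]
centers = kmeans_1d(scores, k=2)
print(centers)  # two centers, near 2.1 and 9.0
```

    The algorithm is told how many groups to look for, but never which score belongs where; it recovers the grouping from the data alone, which is the defining trait of unsupervised methods.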

    However, acquiring these skills does not always require additional coursework, schooling, or even additional financial investment. The internet affords so many resources that one can go from novice to data wizard with some time, structure, intentional searching, and a lot of practice!

    Here are some tips for upping your data analytics skills, coming from someone who developed these skills primarily through self-learning and independent practice, and who leverages this skillset daily in my role in People Analytics at Capital One:

    1. Learn SQL: I can speak firsthand to the challenges of working with data, especially large amounts of organizational data, without first having a basic knowledge of SQL (Structured Query Language). SQL is the fundamental language used to extract or query data, and it is often how I/O psychologists will retrieve existing data from a server or data warehouse. You don’t need to be a SQL expert, but spending an afternoon learning the fundamental commands, and then practicing them as much as possible, will make the rest of your data journey infinitely easier. YouTuber Shashank Kalanithi has a great one-hour tutorial video on getting started with SQL, which also includes accompanying notes reviewing the essential commands you’re likely to use most often.
    2. Reduce, Reuse, Recycle: As you build experience with programming languages like R or Python, you’ll start to recognize that reusability of code (or entire scripts) is something to optimize for. While innovation is invaluable, it doesn’t make sense to reinvent the wheel all the time. Crafting code with a mindset towards scale and reusability will save you a lot of time in the long run. One strategy I’ve found helpful here is to keep a central repository of functions that are useful across many contexts to reference when writing new code. Another is to annotate your scripts as much as you can! This will make it much easier to revisit them later and identify reusable parts of code, including parts that didn’t seem as valuable at the time.
    3. Learn to Search: The languages that analysts use to work with data are vast, and the packages developed for them only add to the depth and scope of what they can do. With that said, it’s highly unlikely that you will learn and memorize every possible command or feature that a language has to offer. Therefore, it’s super important to learn how to search for things. Many packages have built-in help documents, but oftentimes the internet is the best source for answering the more specific questions you’re likely to have. There will be a lot of trial and error, but take note of how you phrase things when you search for how to do something - there’s an art to effectively searching for code resources, and it will come through practice.
    4. Online Tutorials are Fantastic: Online tutorials, in my experience, have served a dual purpose. For starters, they are great ways to get step-by-step demonstrations of how to work with a particular tool or method. They are also great ways to pick up “cheat codes” and learn others’ tips & tricks. In fact, I am hard-pressed to remember a time when I watched a tutorial and didn’t pick up a trick for approaching a certain situation differently or applying a tool or technique in a way I wouldn’t have thought of before. You often get more than what you seek when it comes to these tutorials. Some of my personal favorites have been Shashank Kalanithi and David Robinson for some great R webinars, and Kevin Stratvert for all things Excel (just to name a few).
    5. Hit Those Keyboards: I appreciate you taking the time to read this, and I’m sure creators like those mentioned above also appreciate the time you spend watching them work with data, but you won’t get much value out of any of these resources until you get in there, start writing some code, and create some error messages. A quick search will turn up several free datasets to work with, but good starting options are the “mtcars” and “ToothGrowth” datasets for R (both built into the base version). Python users will get access to several included datasets by installing the “seaborn” package (e.g., “flights”, “fmri”, “penguins”, and “titanic”). With data so prevalent in the world around us, pick a topic area that interests you, and you will likely find a dataset to start working with.  Some of my favorites include Sean Lahman’s baseball database and this Star Wars survey dataset collected by the folks at FiveThirtyEight.
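
    As a small illustration of the “reduce, reuse, recycle” habit from tip 2, here is the kind of generic, well-annotated helper worth keeping in a central repository. The function name and example numbers are purely illustrative:

```python
def summarize(values):
    """Return basic descriptives (n, mean, sd) for a list of numbers.

    Generic on purpose: the same helper works for test scores,
    engagement ratings, or tenure in months.
    """
    n = len(values)
    mean = sum(values) / n
    # Sample standard deviation (n - 1 denominator), as in most stats output.
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return {"n": n, "mean": mean, "sd": sd}

# Example: hypothetical performance ratings on a 1-5 scale.
stats = summarize([3, 4, 5, 4, 4])
print(stats)  # n = 5, mean = 4.0, sd ≈ 0.707
```

    Because nothing in the function is tied to one project, the same helper can be dropped into a selection analysis today and an engagement survey next quarter.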

    There is an incredible amount of value in being a data-savvy I/O psychologist. Fluency in both the language of I/O and the language of data is an incredibly rare skillset today, but one that will become more necessary as data becomes an essential, possibly foundational, element of our lives at work and at home.


    Author Bio: Matt Dickson is a member of the Talent Assessment team at Capital One.  Sitting within People Strategy and Analytics, Matt draws from selection expertise and analytical skills to inform data-backed assessment and selection recommendations across the enterprise.

  • 04/16/2023 3:45 PM | Anonymous

    Author: David Swiderski, PTCMW Blog Editor

    There has been an increase recently in the discussion of the use of artificial intelligence (AI) in the personnel selection space.  One indication of the impact of this area on the field of I/O psychology is the recent announcement of the upcoming 2023 SIOP Leading Edge Consortium (LEC) on Talent Assessment Strategies of the Future with a focus on “new applications of AI to assessment development and scoring”.  The intent of this blog is to provide a brief roundup of recently published practical guidance on the use of AI in personnel selection decisions. 

    AI has not been a focus of traditional education and training in industrial and organizational psychology but is an emerging force affecting how we understand and predict people’s performance at work.  It is imperative that those working in this space understand the implications of using these techniques and how they can be applied appropriately.  This is especially true in high-stakes contexts that can have a significant impact on individuals’ careers. 

    The following documents can serve as a starting point for those considering the development, validation, implementation, and maintenance of AI-based assessments.

    The Institute for Workplace Equality. (2022). Technical Advisory Committee report: EEO and DEI&A considerations in the use of artificial intelligence in employment decision making. Link

    The Institute for Workplace Equality is a non-profit organization that strives to educate businesses on diversity and inclusion practices and equal employment opportunity compliance.  The Institute commissioned a 40-member Technical Advisory Committee (TAC) from domains such as industrial and organizational psychology, psychometrics, data science, economics, and employment law.   The TAC’s in-depth report approaches the topic of AI in employment decision making from statistical, ethical, and legal perspectives by aggregating information across the diverse group of contributors.  Additionally, each section of the report was shaped by the results of a survey of TAC members. Some of the topics covered in the report include:

    1. Where AI is most prevalent in employment decision making.
    2. Privacy and fairness issues around how AI data is collected, used, and stored.
    3. How the concepts and principles described in the Uniform Guidelines on Employee Selection Procedures apply to AI-based assessments.
    4. Analyzing adverse impact in the context of the unique challenges presented by using AI-based assessments.

    Society for Industrial and Organizational Psychology. (2023). Considerations and recommendations for the validation and use of AI-based assessments for employee selection. Link

    Undoubtedly, most readers of this blog will be familiar with the Society for Industrial and Organizational Psychology, the premier professional association for I/O psychologists.  In addition to the LEC mentioned above, SIOP has acknowledged the growing interest in AI by releasing a statement in January 2022 on the use of AI in hiring and offering a webcast on the topic featuring SIOP members with expertise in this area.   The most comprehensive response to date from SIOP in this arena has come in the form of a document from SIOP’s Task Force on AI-Based Assessments.  The document details recommendations from the Task Force on developing, validating, and using AI-based assessments.   The authors discuss how different elements of AI-based assessments relate to the concepts and procedures discussed in SIOP’s Principles for the Validation and Use of Personnel Selection Procedures.  The document delves into issues such as considerations for collecting validation evidence and how the collection of training data in AI-based assessments can influence their scoring and outcomes.  The document concludes with a discussion of the information that should be documented as part of the development and validation of AI-based assessments.

    Landers, R. N., & Behrend, T. S. (2022). Auditing the AI auditors: A framework for evaluating fairness and bias in high stakes AI predictive models. American Psychologist, 78(1), 36–49. https://doi.org/10.1037/amp0000972

    Tara Behrend and Richard Landers have been prominent academic voices on the intersection between technology and human behavior in the workplace for over a decade.  Their recent article in American Psychologist focuses on the ethical use of AI through a set of guidelines for conducting audits of AI systems.  The authors broadly define high-stakes decisions beyond hiring individuals in organizations to many decisions made across different domains of psychology.  The example presented throughout the article focuses on a situation in which an AI algorithm is used to score interview responses.  The article highlights 12 components of AI systems that should be included in an audit and identifies important considerations to think through in the evaluation of each component.  Finally, the article concludes with an emphasis on the importance of collaboration and having an interdisciplinary lens for the adoption of sound AI auditing practices in the future.   

    Tippins, N. T., Oswald, F. L., & McPhail, S. M. (2021). Scientific, legal, and ethical concerns about AI-based personnel selection tools: a call to action. Personnel Assessment and Decisions, 7(2), 1-22. https://doi.org/10.25035/pad.2021.02.001

    The final document in this roundup is a journal article that outlines 11 concerns with using AI from three prominent figures in the personnel selection field.  The concerns touch on many different points in the assessment development and validation process.  The article also contains lists of questions that researchers and practitioners should take into account in light of the concerns stated in the article.  Finally, the article ends with a call to action encouraging I/O psychologists to play a central role in this space by considering how our professional standards and principles apply to AI-based assessments.  Similar to Landers and Behrend, the authors call for using an interdisciplinary approach to make progress in this area.

    Note that this list is a brief roundup of several recent committee and refereed journal-produced documents. We encourage interested readers to be on the lookout for additional professional and scientific developments related to this space. If you would be interested in contributing your perspective to a future blog on AI in the workplace, please reach out to blog@ptcmw.org to discuss your ideas.

  • 02/22/2023 9:04 PM | Anonymous

    Author: Shelby Joseph

    Hello everyone. My name is Shelby Joseph and I am in my final semester of George Mason’s Master of Professional Studies (MPS) program in Industrial and Organizational (I/O) psychology. The following is my reflection on the PTCMW Graduate Student Consulting Challenge (GSCC).

    While in school, I sought out opportunities to build and develop my I/O competencies through extracurricular activities, which led me to connect with peers through local I/O gatherings. When I first learned about PTCMW’s Graduate Student Consulting Challenge from a peer in my program, I knew it would be an opportunity to challenge myself. Unfortunately, I had missed the registration window to participate in the GSCC in 2021. Registration for the GSCC is yearly and begins in October. I waited to apply for the 2022 GSCC and was thrilled when I was accepted to participate because, at this point, I had completed most of my core I/O coursework and knew this was my opportunity to test all I had learned. I expected the GSCC would allow me to experience what it is like to be an I/O consultant.

    I knew preparation for the virtual final presentation would be intense based on my peers’ prior GSCC experiences. For example, I was told that preparation would require a lengthy time commitment during the competition, as students are allotted only 3 days to respond to a proposal and need a focused plan to complete the response by the deliverable deadline. While this expectation was true of my experience, it was also exhilarating! During the initial GSCC meeting, PTCMW event coordinators and representatives from the sponsoring organization introduced our Request For Proposal (RFP). The RFP served as our guide to responding to stakeholders, who in this situation were judges from the sponsoring organization. Our goal was to address an organization’s needs and outline the resources required to meet this goal. My group consisted of 3 students, all in different phases of our I/O programs: one was a doctoral student, another was a Master’s student preparing to graduate, and I had four more courses to complete before graduation.

    What I liked most about the GSCC was my experience collaborating with my teammates. One of the reasons I believe our team succeeded in such a tight timeframe was because our collaboration was quick, efficient, and effortful. My teammates met with me immediately after the initial GSCC meeting to quickly review our approach to the RFP on Microsoft Teams. The RFP for the 2022 GSCC was focused on personnel selection. During the meeting, I created a shared Google Document to begin addressing the RFP. We also reviewed the RFP instructions to plan our tasks. Likewise, we agreed to reserve a large portion of our time in the evenings to collaborate and complete tasks. Our goal at this point was centered on the creation of a detailed report to address the RFP. The first two evenings were dedicated to completing our formal proposal report and appendices with supplemental material, as we agreed this would require the most work.

    During the first two evenings of the competition, we stayed on a Microsoft Teams call for 5 to 6 hours with our cameras off as we took on pieces of the RFP. To split up the tasks, we took on work that we felt competent in or had recently completed in our programs. For example, one teammate was teaching a course on job analysis and took on the material to cover this section, while another had recently completed a job analysis at her job and had practice materials we could use. I had just completed my data analytics course and helped to tackle statistics-based questions, such as an analysis plan for how we would demonstrate the fairness of our proposed solutions. Throughout the call, we would give updates on our completed tasks. During moments of minor disagreement, we used rational debate and a group vote to reach consensus on a way forward.

    The day before the presentation we spent 4 to 5 hours curating our presentation slides. We also organized a shared script on a Google Document so that we knew when it was our cue to speak about our assigned portion of the presentation. By the end of the third day, all members had taken on an equal workload, including the speaking portion of slides. On the day of our presentation to the judges, my team members and I conducted a practice presentation one hour ahead of time to build our comfort in presenting all of the material.

    Despite the projects I succeeded in while at George Mason, it wasn’t until the GSCC presentation to the judges that I realized how many barriers organizations must navigate, and that we as I/O practitioners must consider, are not always explicitly stated in an RFP. This is where the GSCC allowed me to practice the skill of persuading key decision makers who may not have an I/O background. When presenting in front of a panel of 7 to 10 judges, it was apparent that we needed to deliver a compelling case. The judges’ behaviors mimicked those of stakeholders and high-level leaders concerned with results that will exceed their organization’s goals.

    During the presentation, we did not know the exact questions the judges would have concerning our technical approach. The ability to address the judges’ questions was parallel to what I would expect in an actual consultation with a business leader or stakeholder. My group did our best to anticipate questions by identifying beforehand the advantages and disadvantages of the solution we proposed.

    An example of a particularly challenging question was related to budgeting, as none of us had experience creating budgets for a business before. One thing that helped us navigate this question was a discussion my team had during our preparation about the disadvantage of proposing a large budget. This required us to explain why the return on investment was likely more beneficial in the long run and not a risk to the organization. While there was no textbook answer, our team explained our decision to the judges by providing our rationale for how we came up with a budget. This included identifying base hourly pay rates for certain positions and estimating the number of subject matter experts needed to successfully implement our proposed solution. This experience of providing on-the-spot responses and presenting a compelling case to the judges showed me that not all answers are simple or found in a textbook.

    After the evaluative portion of the presentation was over, the support and feedback from the judges was encouraging because it reminded us that this was a learning experience. Once the presentation ended, I recall the moment my team and I reflected on how much we learned by applying the knowledge and skills we gained in our respective I/O programs. This experience helped to reinforce that learning is more than just explicit knowledge; it requires practice and application. When it was announced that we won the GSCC, I knew it was because of our seamless collaboration with one another.

    Since my participation in the GSCC, I now have a clearer understanding about what competencies are necessary to be a successful consultant. While having the foundational I/O knowledge is necessary, I learned it is just as important to explain what we know in a way that can be understood by our stakeholders. I also learned that it is especially important to not only provide solutions, but tailor our solutions in a way that aligns with an organization's business values and mission. This experience helped me to showcase my I/O knowledge in a way that was meaningful and easy to understand for those without an I/O background.

    Additionally, I have leveraged the relationships resulting from my participation in the GSCC to expand my professional network within the field. Competing and winning the GSCC helped open internship opportunities that I would not have applied for had I not participated. For example, winners of the GSCC were provided an invitation to the PTCMW Fall Event. Attending this networking event helped me to connect with more experienced colleagues in a welcoming setting and also gave me a chance to familiarize myself with available full-time I/O positions. I have since obtained an internship with an organization that I was able to talk to during the Fall Event.

    If you have completed your core I/O coursework and would like a realistic job preview of being an I/O consultant, I urge you to participate in the GSCC! The skillsets you gain from collaborating with peers, defending your decisions to experts in the field, and presenting in a manner that can be understood by your stakeholders will likely assist you as you set forth in your I/O career. No matter where your I/O trajectory leads, I hope you are inspired to participate in the GSCC!

  • 10/31/2022 6:08 PM | Anonymous

    Author: David Swiderski, PTCMW Blog Editor

    Call for The Point Blog Contributors

    The Point Blog is the official blog of the Personnel Testing Council of the Metropolitan Washington area (PTCMW), an organization dedicated to advancing the science and practice of industrial and organizational psychology through high-value professional growth and networking opportunities. The Point Blog strives to inform, educate, and entertain our membership and the broader community while maintaining our position as a creative, inclusive, and curious voice in the I/O psychology blogosphere.

    The content that will reside on the blog will focus on the scientific study of human behavior in the workplace. Potential content areas will include topics typically discussed in a graduate industrial and organizational psychology program or at the Society for Industrial and Organizational Psychology (SIOP) annual conference. We aim to publish on a variety of topics within the field and encourage writers from diverse backgrounds to submit their ideas to expose our readership to a range of perspectives on the issues confronting our field. Submissions can come from members or non-members of PTCMW. Graduate students in industrial and organizational psychology or related fields are encouraged to contribute to the blog by submitting ideas for posts they would like to write on the blog. Leveraging our position as an evidence-based field, claims and arguments made on the blog should be backed by rigorous methodology and critical thought.

    What Makes a Good Submission?

    Submissions should be made via a Microsoft Word document submitted to blog@ptcmw.org. Submissions may come in the form of a quick summary of a research program or project, an applied approach to solving an organizational challenge, an opinion piece on an issue that the field is facing, or a perspective on a professional development experience that others would benefit from hearing. Submissions outside of these broad guidelines will be accepted but should be focused on topics related to industrial and organizational psychology. Examples of topics for post submissions include ideas like:

    • A short summary of the findings from a recently published article examining the impact of “return to work” policies on employee engagement
    • A high-level overview of a novel approach to a high-volume pre-hire assessment context within an organization
    • A post challenging common assumptions about work motivation
    • A personal account of formal and informal experiences learning a statistical programming language such as R or Python

    When submitting an idea for The Point Blog, it is not necessary to submit a fully written post, but we should have enough information to evaluate whether it would be a good fit for the blog. A submission should be at least a paragraph in length (4-5 sentences), although longer submissions will be accepted. Please include the following elements in your submission:

    • A summary of the main point of your post that includes any key takeaways that you will want readers to remember. This should include why readers would be excited to view your post and why it matters in the broader context of the field of industrial and organizational psychology.
    • A brief description of the types of information you will be using to back your claims or tell your story. If you plan to include any tables or charts, include a description of how you’d like to present this information and how it supports your story (e.g., “I’d like to include a chart that shows the distribution of performance management ratings collected using two separate methodologies to demonstrate the impact of our new approach to performance management.”)
    • A short post author biography and an email that we can use to contact you regarding your submission.

    You are welcome to submit more than one idea for a blog post or to frame your idea as a series of blog posts, but please make it clear in your submission if this is the case.

    What to Expect from Us?

    As blog editors, it is our job to support you in a way that is transparent, efficient, and quality-focused. We aim to decide whether to proceed with turning a submission into a blog post within two weeks of receiving it. If you do not receive a follow-up or acknowledgement from blog@ptcmw.org within two weeks of sending in your submission, please reach out to confirm that it was received. Editing times may vary depending on the content of the post and the resources available to support the editing process, but in general we aim to publish content at least once per month.

    Throughout the editing process, we’ll be thinking about the following:

    • How does the post align with the objectives of The Point Blog and with PTCMW?
    • Is the idea worthy of professional discourse (i.e., could the post be the focus of a 10–15-minute conversation with a colleague at school, work, or other professional settings)?
    • How does the piece reinforce, add to, or reshape public knowledge on the content area?
    • Is there a clear takeaway or a clear sense of the story being told?
    • How does the evidence submitted illustrate the points being made?
    • How well does the writing adhere to the style guidelines and grammatical conventions?

    Style Guidelines

    To create a unified voice and appearance for our readers, there are a few things you should try to keep in mind while writing a post for The Point Blog.

    • Blog posts will follow the latest American Psychological Association (APA) style guidelines for in-text citations and references.
    • Given the debate and diversity surrounding the name of our field, we believe it is important to note that any references to the field should be written as “industrial and organizational psychology” and should be abbreviated as “I/O psychology”.
    • Avoid gendered terms (e.g., “mankind”) or terms that assume a person’s gender (e.g., use “parenthood” instead of “motherhood/fatherhood”).
    • Avoid ableist language (e.g., “tone-deaf”, “blind to…”, “crazy”). These terms are exclusionary, and the same idea can almost always be expressed in a different way.
    • When giving examples, use “e.g.” to reference an incomplete list that is part of a larger list of items. Use “i.e.” when restating a phrase to clarify an earlier statement.
    • Abbreviations and acronyms: Always spell these out on first reference. Include a parenthetical of the acronym if you think it won’t be obvious for the reader.
    • Numbers: Outside of tables and figures, write out the numbers one through nine. Use numerals for numbers 10 and above unless they’re the first word in a sentence.

    Thank you for taking the time to consider submitting to The Point Blog and we hope to bring our readers content that will enrich their professional development and strengthen the connections among those in our community.

  • 03/01/2022 8:18 PM | Anonymous

    Dear PTCMW Communications,

    Happy New Year PTCMW Members and Friends! The 2022 PTCMW Board is excited to kick off another successful year for PTCMW. Please keep reading for some important updates from the Board.

    Monthly Educational Sessions

    We held our February virtual panel career discussion on February 15, 2022, which focused on offering career advice to students and early career professionals. We want to thank the University of Maryland Baltimore County (UMBC) I/O Psychology Graduate Program and Blacks in I/O Psychology for co-sponsoring the event, as well as our panel members Jeffery Godbout (ICF International and the Global Organization for Humanitarian Work Psychology), Shavonne Holman (Blacks in I/O Psychology), Mike Litano (BetterUp), Shyriah Marshall (Blacks in I/O Psychology and Marshall Career Consulting), and Kathy Stewart (U.S. Customs and Border Protection). With nearly 50 in attendance, it was a tremendous success! Just a reminder that members can access the recordings of any of our 2022 (and earlier) sessions in our webinar library.

    Our next monthly presentation will be on March 23, 2022, at 5:30pm ET. Laura Fields (Spectrum) and Chantale Antonik (Modern Hire) will present a session titled, “Using Selection Science to Source Talent and Identify Fit.” Registration will open soon.

    In light of COVID-19, PTCMW is continuing to offer members and non-members the option of attending our monthly programs without being charged the normal fee. To attend a session free of charge, simply email the secretary (secretary@ptcmw.org) to receive a registration code.

    Member’s Corner Now Live!

    I am excited to announce the launch of the Member’s Corner page on the PTCMW website! The Member’s Corner was developed based on input received from the member survey distributed in 2021. As a PTCMW member, you now have exclusive access to mentoring resources, recordings of previous monthly speaker sessions and events, and a member directory for networking and information sharing.

    Member Event Survey

    As we continue to navigate the uncertainty of COVID-19, we want your input on how best to continue providing our members with educational content and networking opportunities. Please take a few moments to complete this survey so we can gauge interest in both formal and informal events, virtual and in-person. Your feedback will inform decisions for the rest of the 2022 event calendar. Complete the survey by visiting the Member's Corner here.

    Get to Know your 2022 PTCMW Board Members – Spotlight on Phil Walmsley, President-Elect

    With the new year comes new PTCMW Board Members. In each President’s message for 2022, we will spotlight a Board Member, so you can get to know them.

    I am a Lead Personnel Research Psychologist in the Selection and Promotion Assessment section of the U.S. Office of Personnel Management’s (OPM) Human Resources Solutions (HRS) division. On behalf of federal agencies, I conduct job analyses, develop assessment and measurement strategies, evaluate the use of a variety of pre-employment and leadership assessments, deliver training sessions, and conduct large-scale data analyses. I also serve as an advisor on the design of technology systems used for talent acquisition across many agencies. This has given me the chance to participate on teams composed of HR and staffing experts, web programmers, UX designers, data scientists, and multi-organization user groups. A substantial portion of my client work has focused on law enforcement and public safety occupations, but I have had the good fortune to collaborate with people working toward a variety of missions across the federal sector.

    I previously worked in the Personnel Research and Assessment Division of U.S. Customs and Border Protection, which is a group with a long history of operational and scholarly achievement. I try to present and publish work regularly and am a member of the editorial board of the International Journal of Selection and Assessment. I received an M.S. in Industrial-Organizational Psychology from Missouri State University and a Ph.D. in Industrial-Organizational Psychology from the University of Minnesota. In 2022, I received the Society for Industrial-Organizational Psychology’s Distinguished Early Career Contributions-Practice award. I am happy to have the opportunity to serve as PTCMW's current President-Elect and am looking forward to engaging with our community. 

    Outside of work, I enjoy exploring the history of the DC area and Alexandria, VA, where I reside with my wife. I’ve visited many of the U.S.’s national parks, and try to check out and play live music when I can.

    Call for Nominations: Bemis Award

    The PTCMW Board would like to hear your recommendations for this year's nomination for the Bemis Award. Recommendations for the Bemis Award nominee need to be sent to president.elect@ptcmw.org by Monday, March 14, 2022.

    I am very excited to serve as your President in 2022 and look forward to all the exciting things we will accomplish together. We hope to see you at one of our events this year!

    Thank you,

    Marni Falcone, 2022 PTCMW President
