Ongoing SIPS projects
Each entry below lists: the ongoing project, its OSF project page, the community members involved, a summary of the project, its immediate goals, and other important links/information.

Creating a network of contacts at journals/societies
OSF project page: osf.io/6t4gc
Community members involved: David Mellor (david@cos.io)
Summary of project: Use the professional contacts of SIPS members to encourage psychology journals to award open-science practice badges (i.e., pre-registration, open data, open materials).
Immediate goals:
1. Have SIPS members complete a Google form listing individuals they know who are in positions of influence at psychology journals and/or societies (e.g., journal editors, members of publications committees) and to whom they are willing to send an email highlighting the benefits of such badges or the TOP Guidelines.
2. Integrate responses into a spreadsheet indicating who will contact whom; later, email out assignments with template language for this outreach.
Other important links/information:
Spreadsheet of journals, editor contact information, and TOP/Badges status: http://goo.gl/AB2WnE
Email template: goo.gl/IM252u
Form to submit contacts: goo.gl/forms/Jwv4bWdF0U9jsy0D3

Limits of Generalizability Statement (LOGS) white paper
OSF project page: osf.io/cgmn5
Community members involved: Steve Lindsay (slindsay@uvic.ca), David Mellor (david@cos.io), Yuichi Shoda (yshoda@uw.edu), Dan Simons (dsimons@illinois.edu)
Summary of project: Write guidelines for all "empirical papers explicitly [to] state the range of variations (in stimuli, participants, etc.) under which the reported findings are expected to occur. This should help specify what constitutes a replication, and encourage investigation of boundary conditions."
Immediate goals:
1. Write a white paper describing the rationale behind LOGS and including sample statements (completed; available at osf.io/cgmn5).
2. Request and incorporate feedback from associate editors (AEs) and editors within SIPS.
Other important links/information:
Current draft of white paper: osf.io/cgmn5

Creating a guide for individual labs on how and why to improve workflow documentation
OSF project page: osf.io/f529n
Community members involved: Mike Frank (mcfrank@stanford.edu), Lorne Campbell (lcampb23@uwo.ca), Daniel Yurovsky (yurovsky@stanford.edu)
Summary of project: Help labs manage their workflow by creating a step-by-step guide and an Excel template for organizing the research process at each stage (a sketch of one possible layout follows this entry).
Immediate goals:
1. Write the guide, "Improving Workflow for Individual Labs and Researchers".
2. Disseminate the guide and the organization template.
Other important links/information:
Current draft of guide: osf.io/ywav9
Current draft of organization template: osf.io/n2jx5

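Purely as an illustration of the kind of stage-by-stage organization such a guide could recommend, here is a minimal Python sketch that scaffolds a standardized study folder. The specific folder names are assumptions made for illustration, not taken from the draft guide at osf.io/ywav9.

```python
"""Minimal sketch: scaffold one plausible per-study folder layout.
The stage names below are illustrative assumptions, not the draft guide's layout."""
from pathlib import Path

STAGES = [
    "materials",        # stimuli, questionnaires, protocols
    "preregistration",  # preregistered hypotheses and analysis plan
    "data/raw",         # untouched raw data
    "data/processed",   # cleaned data produced by scripts only
    "analysis",         # analysis scripts and notebooks
    "writeup",          # manuscript drafts and figures
]

def scaffold_study(root: str) -> None:
    """Create one folder per stage of the workflow, plus a README stub."""
    for stage in STAGES:
        Path(root, stage).mkdir(parents=True, exist_ok=True)
    readme = Path(root, "README.md")
    if not readme.exists():
        readme.write_text(f"# {Path(root).name}\n\nDocument each stage as you go.\n")

if __name__ == "__main__":
    scaffold_study("example_study")
```
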
FAQ on data sharing for researchers
OSF project page: osf.io/v2s7d
Community members involved: Michelle Meyer (michellenmeyer@gmail.com), Matt Spitzer (matt.spitzer@cos.io), Jake Westfall (jake.westfall@utexas.edu), Rita Ludwig (rludwig@uoregon.edu), Grace Binion (ghicks7@uoregon.edu)
Summary of project: Standardize the practice of researchers uploading their data and codebooks to a repository, with privacy settings restricted to themselves or their lab/department.
Immediate goals:
1. Produce and disseminate a tutorial document on exactly HOW and WHY to incorporate data uploading into the normal workflow (a sketch of a scripted upload follows this entry).
2. Produce and disseminate videos that walk through the data-sharing process.
Other important links/information:
Data sharing FAQ outline: https://goo.gl/lELbOL

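As one illustration of how uploading could become part of the normal workflow, here is a minimal Python sketch that pushes a cleaned data file and codebook to an existing OSF project via the OSF files API (files.osf.io). The project id, file paths, and the OSF_TOKEN environment variable are placeholders, and this is a sketch under those assumptions rather than the tutorial the project will produce; uploaded files remain as private as the project itself until the project is made public.

```python
"""Minimal sketch: upload a data file and codebook to an OSF project as part of
the normal analysis workflow. Project id and paths below are placeholders."""
import os
import requests

OSF_TOKEN = os.environ["OSF_TOKEN"]  # personal access token created on osf.io
PROJECT_ID = "abc12"                 # placeholder: the project's 5-character OSF id

def upload_to_osf(local_path: str, remote_name: str) -> None:
    """PUT a file into the project's osfstorage provider via the OSF files API."""
    url = f"https://files.osf.io/v1/resources/{PROJECT_ID}/providers/osfstorage/"
    with open(local_path, "rb") as fh:
        resp = requests.put(
            url,
            params={"kind": "file", "name": remote_name},
            headers={"Authorization": f"Bearer {OSF_TOKEN}"},
            data=fh,
        )
    resp.raise_for_status()

if __name__ == "__main__":
    upload_to_osf("data/cleaned_data.csv", "cleaned_data.csv")
    upload_to_osf("docs/codebook.pdf", "codebook.pdf")
```
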
Curating resources for teaching and training best practices
OSF project page: osf.io/ehpt4
Community members involved: Victoria Savale, Courtney Soderberg, April Clyburne-Sherin, Daniel Lakens, Catherine O'Dell Fritz, Sean C. Rife, John Kitchener Sakaluk, Cristina Baciu, Julia M. Rohrer, Matthew McBee, Jack Arnal, Benjamin Brown, Laura Scherer
Summary of project: Build a centralized list of teaching materials related to best practices and open science to increase their discoverability.
Immediate goals:
1. Continuously add to a list of resources organized by type (video, manuscript, blog, etc.), difficulty level, author, etc.
2. Integrate the list into the SIPS website.
Other important links/information:
Link to resource list: http://goo.gl/yBEm6O

Sample job ads highlighting open science
OSF project page: osf.io/atnhx
Community members involved: Sanjay Srivastava (sanjay@uoregon.edu)
Summary of project: Improve hiring practices by encouraging endorsement of open science and best practices during the hiring process.
Immediate goals:
1. Create model job-ad language for search committees, to be posted on the SIPS website.
Other important links/information:
Model job ad language: osf.io/24xr3

Tips for job candidates on highlighting open science practices
OSF project page: osf.io/atnhx
Community members involved: David Condon (emailcondon@gmail.com), Anita Eerland (anita.eerland@gmail.com), Hannah Moshontz (hmoshontz@gmail.com), Elliott Kruse (elliott.kruse@owen.vanderbilt.edu)
Summary of project: Help job candidates who are involved in open science, reproducibility, metascience, etc. present themselves successfully to hiring committees.
Immediate goals:
1. Create a resource/guide on how to present yourself and your open-science work, to be posted on the SIPS website.
Other important links/information:
Recommendations to job applicants: osf.io/rw8v9

Gathering data on current hiring practices
OSF project page: osf.io/atnhx
Community members involved: Katie Corker (k.corker@gmail.com)
Summary of project: Understand more about current hiring practices and (potentially) assess how SIPS hiring initiatives change the culture of hiring over time.
Immediate goals:
1. Collect data on current practices, policies, etc.
2. (Potentially, in the future) Collect pre- and post-data on how implementing these changes affects who gets hired, what they do after they are hired, and other outcomes.

Overview paper on replication
OSF project page: osf.io/ed6gv
Community members involved: Rich Lucas (richard.e.lucas@gmail.com)
Summary of project: Integrate different points of view on replication (i.e., pros and cons) into one report.
Immediate goals:
1. Write and publish an overview paper engaging with current controversies regarding replications and their interpretation.
Other important links/information:
Outline of paper on the OSF project wiki: osf.io/ed6gv/wiki/home

"Craigslist"-like platform for study swapping (e.g., Study Swap)
OSF project page: osf.io/9aj5g
Community members involved: Brent Donnellan (mbdonnellan@tamu.edu), Ellen Evers (evers@haas.berkeley.edu), Stephen Williams (stephen@cos.io), Rolf Zwaan (rolfzwaan@gmail.com), Randy McCarthy (rmccarthy3@niu.edu), Christopher Chartier (cchartie@ashland.edu), Hans IJzerman (h.ijzerman@gmail.com), Kiley Hamlin (kiley.hamlin@psych.ubc.ca)
Summary of project: To make replications more accessible and logistically feasible, Study Swap provides a tool that encourages collaborative replication. The idea is that a researcher posts a replication "need" or "have" on an online platform (the OSF project supported by the Many Lab), and researchers then exchange replications.
Immediate goals:
1. Build an OSF project to house "Craigslist"-type exchanges.
2. Run a pilot study of study swapping.
Other important links/information:
OSF project mockup: osf.io/9aj5g

White paper on how to evaluate quality of evidence when reviewing a paper
OSF project page: osf.io/nuhwa
Community members involved: Tal Yarkoni (tyarkoni@utexas.edu)
Summary of project: Establish and disseminate a set of general and specific guidelines for reviewers to follow when evaluating a manuscript.
Immediate goals:
1. Write a white paper outlining how to appropriately evaluate a manuscript (based on the quality of its methods, etc.).
2. Include a checklist that could potentially be required of all reviewers before they send their reviews to editors.

Demonstrating positive change from the open science movement and tracking opinions about problems and solutions
OSF project page: osf.io/nuhwa
Community members involved: Brett Buttliere (Brettbuttliere@gmail.com)
Summary of project: Follow up on a survey that assessed attitudes and behavior around open science issues.
Immediate goals:
1. Survey current opinions about open science in the Study 1 sample.
2. Assess changes in attitudes and behavior from 2013 (Study 1) to now.
Other important links/information:
Report of Study 1: osf.io/kpiq3

statcheck at article submission
OSF project page: osf.io/6t4gc
Community members involved: Stephen Lindsay (slindsay@uvic.ca)
Summary of project: Run a small pilot at Psychological Science using statcheck on manuscripts that received external review and were not rejected after the first round of reviews, to assess the viability of statcheck as an error-flagging mechanism (a sketch of this kind of check follows this entry).
Immediate goals:
1. Contact Michèle Nuijten about using statcheck at submission.
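statcheck itself is an R package, so the following is only a Python sketch of the kind of consistency check it automates: extract APA-style t-test reports from manuscript text and compare the reported p-value with one recomputed from the test statistic and degrees of freedom. The regular expression and tolerance are deliberate simplifications; statcheck's actual grammar also covers F, r, chi-square, and z tests and handles rounding and one-tailed tests.

```python
"""Minimal sketch (not statcheck itself) of a reported-vs-recomputed p-value check
for APA-style t-test reports such as "t(28) = 2.20, p = .036"."""
import re
from scipy import stats

# Only handles exact two-tailed "t(df) = value, p = value" reports; a simplification
# of statcheck's full grammar.
T_TEST = re.compile(
    r"t\((?P<df>\d+(?:\.\d+)?)\)\s*=\s*(?P<t>-?\d+\.?\d*)\s*,\s*p\s*=\s*(?P<p>\.?\d+)",
    re.IGNORECASE,
)

def check_t_tests(text: str, tolerance: float = 0.005):
    """Yield (reported_p, recomputed_p, consistent) for each t-test found in text."""
    for m in T_TEST.finditer(text):
        df, t, p_reported = float(m["df"]), float(m["t"]), float(m["p"])
        p_recomputed = 2 * stats.t.sf(abs(t), df)   # two-tailed p from t and df
        yield p_reported, p_recomputed, abs(p_reported - p_recomputed) <= tolerance

if __name__ == "__main__":
    sample = "The effect was significant, t(28) = 2.20, p = .036."
    for reported, recomputed, ok in check_t_tests(sample):
        print(f"reported p = {reported:.3f}, recomputed p = {recomputed:.3f}, "
              f"{'consistent' if ok else 'FLAG'}")
```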