Developed by Yijun Lin, Babak Hemmatian, Haotian Wang, and Naman Raina.
An emerging consensus in the cognitive sciences states that flexible, adaptive behavior (i.e., intelligence) does not come from individuals alone, but often reflects the competent incorporation of knowledge and skills from one's community (Sloman & Fernbach, 2018). For instance, research identifies the ability to coordinate successfully with one's team members, called collective intelligence (CI), as a much better predictor of group outcomes than individual IQs (Riedl et al., 2022). Although creativity is an increasingly important manifestation of intelligence in the information economy, creativity research has not kept up with this collective shift in the cognitive sciences.
We present a new experiment platform that addresses this gap by allowing controlled study of human-AI and human-human teams in a creative task. We have created a cooperative version of the classic Alternate Uses Test (AUT; Guilford, 1967), in which the goal is to produce as many original and practical uses for an everyday object as possible within a time limit. Our platform allows co-players to interact freely during the ideation stage before choosing their personal responses during the curation step. Our web app allows identical procedures to be used for human-human and human-AI pairs, and experimental controls to be applied to the chat. The results can be evaluated using the same procedures as in the standard individual test of creativity. We currently use GPT-4 as the AI agent, but the platform's modularity allows us to replace it with more or less advanced algorithms as needed.
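To illustrate what this modularity could look like, here is a minimal Python sketch of a swappable co-player interface. This is not the platform's actual code: the class and method names are hypothetical, the GPT-4 call assumes the official openai Python client, and the "Constant" baseline only loosely mirrors the scripted condition mentioned in the FAQs.

```python
# Hypothetical sketch of a swappable AI co-player interface (not the platform's real code).
from abc import ABC, abstractmethod

from openai import OpenAI  # assumes the official openai Python package


class CoPlayerAgent(ABC):
    """Anything that can produce a reply in the ideation chat."""

    @abstractmethod
    def reply(self, chat_history: list[dict]) -> str:
        ...


class GPT4Agent(CoPlayerAgent):
    """Interactive AI condition: forwards the chat history to GPT-4."""

    def __init__(self, model: str = "gpt-4"):
        self.client = OpenAI()  # reads OPENAI_API_KEY from the environment
        self.model = model

    def reply(self, chat_history: list[dict]) -> str:
        # chat_history is assumed to be a list of {"role": ..., "content": ...} dicts.
        response = self.client.chat.completions.create(
            model=self.model,
            messages=chat_history,
        )
        return response.choices[0].message.content


class ConstantAgent(CoPlayerAgent):
    """Scripted baseline: cycles through pre-written replies regardless of the chat."""

    def __init__(self, scripted_replies: list[str]):
        self.scripted_replies = scripted_replies
        self._i = 0

    def reply(self, chat_history: list[dict]) -> str:
        message = self.scripted_replies[self._i % len(self.scripted_replies)]
        self._i += 1
        return message
```

Because both agents expose the same `reply` method, the chat server could swap GPT-4 for a simpler or more advanced algorithm without changing the rest of the pipeline.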
- Open a browser - Google Chrome, Microsoft Edge, or Mozilla Firefox.
- Take the Qualtrics survey in a separate tab. Please make sure that paired participants take the same survey to avoid timing mismatches.
- Once you reach the second page of the survey, please take at least one minute to read the instructions; the right arrow will not appear until one minute has passed.
- For the Pre-Test, the next page asks you to spend two minutes writing up creative uses for the assigned item (in this case, a safety pin). For the Post-Test, the next page is an instructions page that you should read before you head over to the chat app.
(Screenshots: Pre-Test and Post-Test pages)
- For the Pre-Test, you will see the following page. Please read the instructions carefully and use this link for the chat app. For the Post-Test, after you are done with Round 1, please spend 5 minutes writing in detail to answer the prompt below.
(Screenshots: Pre-Test and Post-Test pages)
- For the Pre-Test, once you are done with Round 1, please spend 5 minutes writing in detail to answer the prompt below. For the Post-Test, please follow the instructions below and begin matching again for the next round.
(Screenshots: Pre-Test and Post-Test pages)
- For the Pre-Test, please follow the instructions below and begin matching again for the next round. The above steps need to be repeated for both the Pre-Test and the Post-Test until the end of the third round.
- The rest of the survey has clear instructions and is easy to follow. If you have any questions, please see the FAQs section.
- Create two new tabs.
- In both, head over to the chat app.
- When the login page appears, please log in or register with two different profiles. Make sure the usernames and avatars you choose are easy to tell apart.
- Once the chat app page shows up, select 'match' in the top-left corner in both tabs.
- Either use the link here.
- Head over to the chat app.
- You will be taken to the following page:
- Enter your information (note that registration requires a password confirmation).
- Choose an avatar and type in your name as shown below:
- You will see this screen (refresh the page so that your avatar appears in the top right). Click on "match".
- You will be directed to the chat screen. Type "ready" to get started. (If you want to script this two-profile walkthrough for pilot testing, see the sketch below.)
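The two-tab setup described above can also be scripted for pilot testing. Below is a minimal browser-automation sketch using Playwright's Python API; it is not part of the platform, and the URL, element selectors, and test credentials are all placeholder assumptions that would need to be adapted to the deployed chat app and its actual login form.

```python
# Hypothetical pilot-testing script: drives two participant profiles through
# login -> match -> "ready", mirroring the manual two-tab walkthrough above.
from playwright.sync_api import sync_playwright

CHAT_APP_URL = "https://example.com/chat"  # placeholder; use the real chat app link

with sync_playwright() as p:
    browser = p.chromium.launch(headless=False)
    pages = []
    # Two isolated contexts behave like two separate participants (two profiles).
    for username in ("pilot_user_1", "pilot_user_2"):
        context = browser.new_context()
        page = context.new_page()
        page.goto(CHAT_APP_URL)
        # The selectors below are assumptions; adjust to the app's actual form fields.
        page.fill("#username", username)
        page.fill("#password", "test-password")
        page.click("text=Log in")
        page.click("text=match")  # the 'match' button in the top-left corner
        pages.append(page)
    # Once matched, both participants type "ready" to start the round.
    for page in pages:
        page.fill("#message-input", "ready")
        page.press("#message-input", "Enter")
    browser.close()
```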
Please refresh the page; the chat app will restart from the round (Human, Interactive AI, or Constant) you are currently in. If the problem persists, please log out and then log in again.
Please contact [email protected].