In the same vein as https://discourse-testing.openhumans.org/t/re-approval-for-project-opensnp/18/3, I’d like to ask for re-approval of the Twitter Archive Analyzer.
Should this project be visible and available for all Open Humans members to join?
Please vote Approve or Deny, and/or comment.
Comments: This project was previously approved, but has been suspended by its coordinator and requests re-review before resuming. The project adds a Twitter archive (which the user downloads themselves) to their Open Humans account, and has no other private data access.
- Activity page: https://www.openhumans.org/activity/twitter-archive-analyzer/
- Project review guide: Project Review Guide
- Project guidelines: https://www.openhumans.org/community-guidelines/#project
- Title: Twitter Archive Analyzer
- Managed by: Bastian Greshake Tzovaras
- Description: Twitter archives are a rich source of data for doing research into numerous things: Learning about social media and interaction networks, gaining insights into movement patterns based on geolocations and even doing sentiment analysis based on the tweets. And the best part of it: Unless you have a protected Twitter account this data is already public. So why not share it? The TwArχiv takes in your Twitter archive and generates interesting visualizations from your own tweets, including tweet volume over time and your interaction/movement patterns.
- Project website: https://twtr-analyser.herokuapp.com/
- Connections: 274 members (6 with public data)
- Data received: None
- Data added: Zipped Twitter archive (retrieved and uploaded by user)
- Data is submitted manually by the user (no API calls): the user must thoughtfully curate their data, downloading the archive from Twitter and then uploading it to the analyzer.
- The visualizations are very interesting.
I am a volunteer (developer) for Open Humans and I do use this integration myself.
Because @gedankenstuecke would like to highlight this project in an upcoming talk, I hope to make a final approve/deny decision by Tuesday 3pm Pacific.
Approval with caveat (Apologies for creating a new vote type.) I have one major concern I’d like to see addressed. However, because (1) the project’s data is not particularly sensitive, (2) it has already been active and used by a couple hundred members, and (3) I think the solution may involve work on Open Humans’ part (rather than this project’s) … I think it is in the community’s interest to see this project operating again, with the understanding that the concern will be addressed in the coming weeks.
- Community Guidelines: Good – I tend to trust project operators on this, but it seems like this is following guidelines. I appreciate that it’s open source!
- Clarity in communication: Mediocre – Here is the caveat mentioned above. I feel very apologetic in raising this concern, but in interactions with third-party projects, a member is deciding whether to trust the entity operating that project. So I think it’s very important that the identity of who/what is running a project be clear. It’s especially important that it doesn’t appear to be Open Humans itself. This project does a great job with its “About” page, but I still have a concern: this project (and some others) is visually very similar to some data sources that are operated by Open Humans Foundation proper – 23andMe upload and AncestryDNA upload. I don’t think this should block re-approval, but I’d like to see this issue improved in the coming weeks. (More on this below.)
- Security/privacy: Good – security concerns aren’t very high here, as the data added here is fairly public from the start, assuming a user’s Twitter account is public. @beau had some suggestions for Django security settings in (re)approval for Genevieve Genome Report that might be useful.
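For reference, Django deployment hardening of the kind usually meant by such suggestions tends to look like the sketch below. This is a generic illustration only (not the specific settings @beau proposed, which aren’t reproduced here), and the hostname is just the project’s Heroku domain used as a placeholder:

```python
# settings.py fragment — generic Django production hardening (illustrative only)

DEBUG = False  # never run production with DEBUG enabled

# Force HTTPS, and mark cookies so they are never sent over plain HTTP
SECURE_SSL_REDIRECT = True
SESSION_COOKIE_SECURE = True
CSRF_COOKIE_SECURE = True

# HTTP Strict Transport Security: browsers refuse plain-HTTP connections
SECURE_HSTS_SECONDS = 31536000  # one year
SECURE_HSTS_INCLUDE_SUBDOMAINS = True

# Basic header hardening
SECURE_CONTENT_TYPE_NOSNIFF = True
X_FRAME_OPTIONS = "DENY"

# Only answer requests addressed to the app's own hostname
ALLOWED_HOSTS = ["twtr-analyser.herokuapp.com"]
```

Django’s own `manage.py check --deploy` command reports which of these settings are missing or unsafe in a given deployment.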
Caveat & solutions: The style/layout similarities between this and the projects operated by Open Humans are an unfortunate consequence of our open source approach in creating/sharing/re-using the underlying web app code.
The main solution I think is needed here is on the Open Humans end: projects operated by Open Humans should be redesigned to have a distinct visual style (using OH colors & styles). This would address my largest concern about “confusion with Open Humans”.
(I think it’s also important that non-OH entities not be confused with each other. For that, a more distinctive style/layout for this project would also be helpful.)
Disclosure: @gedankenstuecke is Director of Research of the Open Humans Foundation – as Executive Director, I feel I have some conflict of interest in reviewing an independent project run by a member of staff. Also, I’ve used this integration myself.
Thanks to @Benc for contributing a review, as well as feedback in Slack!
This feedback, combined with my own review, supports (re)approval of the project.
The main caveat is that this project needs to better communicate itself as distinct from projects operated by Open Humans. However, it seems the primary solution for this is to more distinctively style the projects operated by Open Humans, rather than to amend this project. (That should occur in the web apps, and might also occur in the main site.) This project is fine to operate in the meantime, particularly because it has low data sensitivity, a clear “about” page, and was already in use by a couple hundred members.