Blocking Twitter users from R

Last year, I undertook a brief analysis of abusive tweets directed at Canada’s (then) Minister of the Environment and Climate Change, Catherine McKenna. Twitter can be a great platform for the exchange of information, but it can also be a cesspool of hate and negativity. I’m not an important person, so I can usually avoid the bots and the trolls fairly easily (with the exception of the occasional dust-up with climate change deniers). However, more prominent climate scientists, and of course the political class, are far less able to avoid abusive replies.

While I am totally on board with the right to free expression, I am also a proponent of the right to not have to listen to jerks. I was curious to see whether I could use R to find and block people I have no interest in hearing from on Twitter.[^Whether they be “bots”, or just humans with whom I have no desire to interact.] I will accomplish this goal with the help of the awesome rtweet package, from the rOpenSci community. I’ll also use a few packages from the tidyverse, and httr.

You can do lots of things with rtweet without needing Twitter developer credentials; however, we’re going to need to perform some “post” actions, which means that we will require our own login token. I won’t go into the details of obtaining these credentials and creating the token, because the rtweet authors have already done a great job of documenting those steps. Check out the “auth” vignette, by typing:

vignette("auth")

Once you have followed the instructions in the “auth” vignette, you should have a working token that will allow you to use the post endpoints of the Twitter API to modify your account.

token <- get_token()
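If you haven’t created a token before, the creation step boils down to something like the sketch below. Treat this as an illustration only: the app name and environment variable names are placeholders, and the vignette remains the authoritative reference. By default, create_token() caches the token so that get_token() can find it in later sessions.

## A minimal sketch, assuming you have already created a Twitter developer
## app. Every value below is a placeholder; substitute your own credentials.
library(rtweet)

create_token(
  app             = "my_app_name",                      # your app's name
  consumer_key    = Sys.getenv("TWITTER_API_KEY"),      # hypothetical environment
  consumer_secret = Sys.getenv("TWITTER_API_SECRET"),   # variables holding your
  access_token    = Sys.getenv("TWITTER_ACCESS_TOKEN"), # app's keys and tokens
  access_secret   = Sys.getenv("TWITTER_ACCESS_SECRET")
)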

Identifying users to block

There are surely myriad ways of identifying antagonistic users on Twitter, but an easy way to find them is to look at popular hashtags. This isn’t a perfect solution. For instance, a search for “#maga” turns up a lot of folks who have “reclaimed” the hashtag to tick off Trump supporters, so you could end up blocking folks you might actually want to hear from. In this blog post, we’ll use a slightly more niche hashtag, based on last year’s look into Catherine McKenna’s Twitter experience. It seems that her new portfolio in infrastructure has not brought relief from the jerks on Twitter: she is still the target of consistent pestering in the form of the “#wheresthemoneycatherine” hashtag.

Let’s search for tweets using our hashtag.

tweets <- search_tweets("#wheresthemoneycatherine", n = 1000, include_rts = FALSE,
                        verbose = FALSE)

The search_tweets() function uses Twitter’s free search API endpoint, so it only returns data from the last 7 days. It can also return up to 18,000 statuses, but I have opted to limit my search to 1000. There is a lot of information returned by the API.
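As an aside, if you ever need more than a few thousand statuses, search_tweets() can page through results for you. A sketch (not run here) might look like the following, though the free endpoint will still only reach back 7 days.

## Not run: a larger search. retryonratelimit = TRUE asks search_tweets() to
## wait out rate limits and keep paging until it has collected n tweets (or
## exhausted the 7-day window).
big_search <- search_tweets("#wheresthemoneycatherine", n = 18000,
                            include_rts = FALSE, retryonratelimit = TRUE)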

Here’s a sample of the data.
glimpse(tweets)

#> Rows: 462
#> Columns: 90
#> $ user_id                 <chr> "23505313", "2817035628", "2817035628", "2817…
#> $ status_id               <chr> "1315663302102376448", "1315662762765414400",…
#> $ created_at              <dttm> 2020-10-12 14:39:09, 2020-10-12 14:37:01, 20…
#> $ screen_name             <chr> "SwimCoachClint", "overthemoonmark", "overthe…
#> $ text                    <chr> "@cathmckenna Today is Monday, October 12, 20…
#> $ source                  <chr> "Twitter for Android", "Twitter for iPhone", …
#> $ display_text_width      <dbl> 238, 54, 54, 54, 213, 183, 105, 213, 274, 280…
#> $ reply_to_status_id      <chr> "1315384462239244294", "1315407437336252416",…
#> $ reply_to_user_id        <chr> "140252240", "1199445363691646976", "94322696…
#> $ reply_to_screen_name    <chr> "cathmckenna", "Krisster81", "will_bouw", "Em…
#> $ is_quote                <lgl> FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FAL…
#> $ is_retweet              <lgl> FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FAL…
#> $ favorite_count          <int> 0, 0, 0, 0, 4, 7, 0, 0, 0, 1, 3, 0, 0, 0, 2, …
#> $ retweet_count           <int> 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 3, …
#> $ quote_count             <int> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ reply_count             <int> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ hashtags                <list> ["WheresTheMoneyCatherine", <"TrudeauCorrupt…
#> $ symbols                 <list> [NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA,…
#> $ urls_url                <list> [NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA,…
#> $ urls_t.co               <list> [NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA,…
#> $ urls_expanded_url       <list> [NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA,…
#> $ media_url               <list> [NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA,…
#> $ media_t.co              <list> [NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA,…
#> $ media_expanded_url      <list> [NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA,…
#> $ media_type              <list> [NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA,…
#> $ ext_media_url           <list> [NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA,…
#> $ ext_media_t.co          <list> [NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA,…
#> $ ext_media_expanded_url  <list> [NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA,…
#> $ ext_media_type          <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ mentions_user_id        <list> [<"140252240", "140252240">, <"1199445363691…
#> $ mentions_screen_name    <list> [<"cathmckenna", "cathmckenna">, <"Krisster8…
#> $ lang                    <chr> "en", "und", "und", "und", "en", "en", "und",…
#> $ quoted_status_id        <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ quoted_text             <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ quoted_created_at       <dttm> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, …
#> $ quoted_source           <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ quoted_favorite_count   <int> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ quoted_retweet_count    <int> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ quoted_user_id          <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ quoted_screen_name      <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ quoted_name             <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ quoted_followers_count  <int> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ quoted_friends_count    <int> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ quoted_statuses_count   <int> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ quoted_location         <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ quoted_description      <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ quoted_verified         <lgl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ retweet_status_id       <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ retweet_text            <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ retweet_created_at      <dttm> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, …
#> $ retweet_source          <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ retweet_favorite_count  <int> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ retweet_retweet_count   <int> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ retweet_user_id         <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ retweet_screen_name     <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ retweet_name            <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ retweet_followers_count <int> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ retweet_friends_count   <int> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ retweet_statuses_count  <int> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ retweet_location        <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ retweet_description     <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ retweet_verified        <lgl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ place_url               <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ place_name              <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ place_full_name         <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ place_type              <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ country                 <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ country_code            <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ geo_coords              <list> [<NA, NA>, <NA, NA>, <NA, NA>, <NA, NA>, <NA…
#> $ coords_coords           <list> [<NA, NA>, <NA, NA>, <NA, NA>, <NA, NA>, <NA…
#> $ bbox_coords             <list> [<NA, NA, NA, NA, NA, NA, NA, NA>, <NA, NA, …
#> $ status_url              <chr> "https://twitter.com/SwimCoachClint/status/13…
#> $ name                    <chr> "Coach Clint", "Noodles", "Noodles", "Noodles…
#> $ location                <chr> "Canada", "Toronto, Ontario", "Toronto, Ontar…
#> $ description             <chr> "Have strong opinions. I am in protest of my …
#> $ url                     <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ protected               <lgl> FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FAL…
#> $ followers_count         <int> 552, 78, 78, 78, 78, 78, 78, 78, 78, 78, 78, …
#> $ friends_count           <int> 962, 153, 153, 153, 153, 153, 153, 153, 153, …
#> $ listed_count            <int> 9, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, …
#> $ statuses_count          <int> 2233, 8006, 8006, 8006, 8006, 8006, 8006, 800…
#> $ favourites_count        <int> 1182, 6998, 6998, 6998, 6998, 6998, 6998, 699…
#> $ account_created_at      <dttm> 2009-03-09 21:30:46, 2014-09-18 13:48:02, 20…
#> $ verified                <lgl> FALSE, FALSE, FALSE, FALSE, FALSE, FALSE, FAL…
#> $ profile_url             <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ profile_expanded_url    <chr> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ account_lang            <lgl> NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, NA, N…
#> $ profile_banner_url      <chr> "https://pbs.twimg.com/profile_banners/235053…
#> $ profile_background_url  <chr> "http://abs.twimg.com/images/themes/theme1/bg…
#> $ profile_image_url       <chr> "http://pbs.twimg.com/profile_images/13006486…

So, are any of these users actually bots? Since we know that the API only returns data from the last 7 days, we might assume that any user repeating the same hashtag many times in that period is a bot. Let’s count the number of times each account used the hashtag.

tweets %>%
  group_by(screen_name, user_id) %>%
  summarize(n = n(), .groups = "drop") %>%
  arrange(desc(n)) -> counts

counts

#> # A tibble: 220 x 3
#>    screen_name     user_id                 n
#>    <chr>           <chr>               <int>
#>  1 wilson_the_dogg 3380224119             34
#>  2 overthemoonmark 2817035628             32
#>  3 OttawaGordon    278770541              31
#>  4 Dphitchcock     815602484123144193      8
#>  5 KRS61           292091883               8
#>  6 BlushingBelles  1228719819974832128     7
#>  7 don_unka        1090381795256864768     6
#>  8 RlHilts         2850783528              6
#>  9 spock246        1045546682              6
#> 10 bkruhlak        444269422               5
#> # … with 210 more rows
ggplot(counts, aes(x = n)) +
  geom_density() +
  geom_vline(aes(xintercept=median(n)),
            colour = "blue", linetype = "dashed") +
  geom_vline(aes(xintercept=quantile(n, 0.95)),
            colour = "red", linetype = "dashed") +
  ggtitle("Density of number of tweets containing \"wheresthemoneycatherine\" by user",
          subtitle = paste("Median (blue) =", median(counts$n),
                           "/ 95th percentile (red) =", round(quantile(counts$n, 0.95), 3)))

We can see that the vast majority of tweeters used the hashtag only once in our search history, and almost all of them could count the number of times they tweeted the hashtag on one hand. However, a handful of users used the hashtag dozens of times.
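If you’d rather pick out the heavy users programmatically than eyeball the plot, a quick (and admittedly arbitrary) approach is to flag anyone above the 95th percentile:

## A rough sketch: treat anyone above the 95th percentile of hashtag use as a
## candidate for closer inspection. The cutoff is entirely arbitrary.
candidates <- counts %>%
  filter(n > quantile(n, 0.95))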

It might be enough to assume that a user that is using the same hashtag dozens of times in the space of a week is a bot, or at least an agitator, but if you want to be really sure before blocking the user, you can try a bot-checking API.

Optional: Using a bot-checking API

Botometer is a tool that checks whether a Twitter user is likely to be a bot or just a regular user. Botometer offers an API, which has been implemented in the botcheck package. However, botcheck has not been updated in a number of years. I’ll admit that I didn’t try the package (it may still work), but I decided to emulate its functionality in this post instead.

First, we will need an API key for the Botometer API. Sign up for a RapidAPI account and subscribe (for free) to the Botometer API, and you will be able to see your key in the example code provided on the endpoints page.

I have saved my RapidAPI key to an environment variable called RAPIDAPI_KEY.

RAPIDAPI_KEY <- Sys.getenv("RAPIDAPI_KEY")
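If you haven’t stored a key yet, one option (among many) is to add a line of the form RAPIDAPI_KEY=your-key-here to your ~/.Renviron file. The usethis package, if you have it installed, will open that file for you:

## Optional: open ~/.Renviron, add RAPIDAPI_KEY=your-key-here, then restart R
## so that Sys.getenv() can see the new variable.
usethis::edit_r_environ()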

Now let’s reproduce the botcheck() function from the botcheck package. The Botometer API asks for a request body that contains user information, the user’s timeline, and tweets that mention the user. Let’s collect those details from Twitter. For this example, we can use our top-ranked tweeter, wilson_the_dogg (user ID 3380224119), who tweeted the hashtag 34 times.

Request Twitter data

The rtweet package doesn’t seem to have a function that accesses the users/show endpoint, and the functions it does have convert the JSON responses to R objects. When I converted that information back to JSON, I couldn’t get it to play nicely with the Botometer API. Rather than using rtweet here, we’ll create our own little function to request the necessary data from Twitter, based on the botcheck() function.

## Request user, timeline, or mention data for a given user from the Twitter
## v1.1 REST API, and return the parsed JSON content.
twitter_data <- function(type, screen_name, user_id, token) {
  url <- switch(type,
                "user" = sprintf("https://api.twitter.com/1.1/users/show.json?screen_name=%s&user_id=%s&count=200",
                                 screen_name, user_id),
                "timeline" = sprintf("https://api.twitter.com/1.1/statuses/user_timeline.json?screen_name=%s&user_id=%s&count=200&include_rts=true",
                                     screen_name, user_id),
                "mentions" = sprintf("https://api.twitter.com/1.1/search/tweets.json?q=%%40%s&count=200", screen_name))
  r <- GET(url, token)
  if (r$status_code == 200) {
    content(r, type = "application/json")
  } else {
    stop("Request for ", type, " data failed with status ", r$status_code)
  }
}

Collect user data.

user <- twitter_data(type = "user", screen_name = counts$screen_name[1],
                     user_id = counts$user_id[1], token)

Collect the user’s timeline.

timeline <- twitter_data(type = "timeline", screen_name = counts$screen_name[1],
                         user_id = counts$user_id[1], token)

Collect tweets mentioning the user.

mentions <- twitter_data(type = "mentions", screen_name = counts$screen_name[1],
                         user_id = counts$user_id[1], token)

Making the request

Now we can put this all together into a body object.

body <- list(user = user,
             timeline = timeline,
             mentions = mentions)

Next we make the Botometer API request.

result <-  POST("https://rapidapi.p.rapidapi.com/4/check_account",
                encode = "json",
                add_headers("X-RapidAPI-Host" = "botometer-pro.p.rapidapi.com",
                            "X-RapidAPI-Key" = RAPIDAPI_KEY,
                            useQueryString = TRUE),
                body = jsonlite::toJSON(body, null = "null", digits = 50,
                                        auto_unbox = TRUE))

Check the user’s score.

out <- content(result, "parsed", encoding = "UTF-8")
out$display_scores$english$overall

#> [1] 1.4

Botometer ranks accounts on a scale of 0 to 5, with higher scores indicating a higher likelihood of being a bot. With a score of 1.4, it looks like @wilson_the_dogg isn’t a bot after all, just someone who feels the need to tweet a hashtag dozens of times in the space of a week! In fact, at the time of writing this post, Bot Sentinel (which identifies malicious users, not just bots) had classed this user as “disruptive”. Note that a Bot Sentinel API is marked as “coming soon”.

We can check the second user as well, this time sticking all of the requests into one call (usually a bad idea).

result <-  POST("https://rapidapi.p.rapidapi.com/4/check_account",
                encode = "json",
                add_headers("X-RapidAPI-Host" = "botometer-pro.p.rapidapi.com",
                            "X-RapidAPI-Key" = RAPIDAPI_KEY,
                            useQueryString = TRUE),
                body = jsonlite::toJSON(lapply(c(user = "user", timeline = "timeline",
                                                 mentions = "mentions"),
                                               twitter_data, counts$screen_name[2],
                                               counts$user_id[2], token),
                                        null = "null", digits = 50,
                                        auto_unbox = TRUE))

Let’s see the user’s score.

out <- content(result, "parsed", encoding = "UTF-8")
out$display_scores$english$overall

#> [1] 1.6

With a score of 1.6, it looks like @overthemoonmark probably isn’t a bot either, but Bot Sentinel thinks that the user’s activity is “questionable”.
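If you wanted to check more than a couple of accounts, you could wrap the steps above into a small helper. Here is a rough sketch that reuses our twitter_data() function; botometer_score() is just a name I’ve invented, and you would probably want to add some rate-limit handling before pointing it at a long list of users.

## A sketch of a helper that fetches the Botometer score for a single user.
## botometer_score() is a hypothetical name; it reuses twitter_data() from above.
botometer_score <- function(screen_name, user_id, token, rapidapi_key) {
  body <- list(user     = twitter_data("user", screen_name, user_id, token),
               timeline = twitter_data("timeline", screen_name, user_id, token),
               mentions = twitter_data("mentions", screen_name, user_id, token))
  result <- POST("https://rapidapi.p.rapidapi.com/4/check_account",
                 encode = "json",
                 add_headers("X-RapidAPI-Host" = "botometer-pro.p.rapidapi.com",
                             "X-RapidAPI-Key" = rapidapi_key,
                             useQueryString = TRUE),
                 body = jsonlite::toJSON(body, null = "null", digits = 50,
                                         auto_unbox = TRUE))
  ## Return only the overall English display score.
  content(result, "parsed", encoding = "UTF-8")$display_scores$english$overall
}

## For example:
## botometer_score(counts$screen_name[3], counts$user_id[3], token, RAPIDAPI_KEY)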

Blocking users

Bots or not, can we block these users? rtweet includes functions to mute users (post_mute()), but there doesn’t seem to be any function to block them. Let’s write one, borrowing some of the internal functions from rtweet (though we could just as easily use the POST() function from httr, as we did for the Botometer API).

block_user <- function(screen_name, user_id, token = NULL) {
  ## The rtweet authors turn off scientific notation so that 
  ## R doesn't misrepresent the user_id strings.
  op_enc <- getOption("encoding")
  op_sci <- getOption("scipen")
  on.exit(options(scipen = op_sci,
                  encoding = op_enc), add = TRUE)
  options(scipen = 14, encoding = "UTF-8")

  query <- "blocks/create"
  token <- rtweet:::check_token(token)
  get <- FALSE
  params <- list(screen_name = screen_name, user_id = user_id)
  url <- rtweet:::make_url(query = query, param = params)
  resp <- rtweet:::TWIT(get = get, url, token)
  rtweet:::from_js(resp)
}
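As mentioned above, we could also skip the rtweet internals entirely and hit the blocks/create endpoint with httr, mirroring the pattern of our twitter_data() function. A sketch of that approach (block_user_httr() is just a name I’ve made up) might look like this:

## A sketch using httr directly; assumes `token` is the token we retrieved
## with get_token() earlier, passed to POST() as in twitter_data() above.
block_user_httr <- function(screen_name, user_id, token) {
  url <- sprintf("https://api.twitter.com/1.1/blocks/create.json?screen_name=%s&user_id=%s",
                 screen_name, user_id)
  r <- POST(url, token)
  if (r$status_code != 200) {
    warning("Blocking ", screen_name, " returned status ", r$status_code)
  }
  invisible(content(r, type = "application/json"))
}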

Let’s look at users that have tweeted the hashtag more than a dozen times, and block them.

counts %>%
  filter(n > 12) -> highn

highn

#> # A tibble: 3 x 3
#>   screen_name     user_id        n
#>   <chr>           <chr>      <int>
#> 1 wilson_the_dogg 3380224119    34
#> 2 overthemoonmark 2817035628    32
#> 3 OttawaGordon    278770541     31

Now we can block some of these users! Our function only works on one user at a time, but we can lazily vectorize it using Vectorize(). The Twitter API returns a user information object for calls to the blocks/create endpoint. I’m not interested in printing all of that information in this blog post, so I’ll store it in an R object, but this isn’t strictly necessary.

block_user_v <- Vectorize(block_user)
rs <- block_user_v(highn$screen_name, highn$user_id)
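If you prefer the tidyverse idiom, purrr::map2() would do the same job, with the same caveat that each call fires off a real API request:

## An equivalent sketch with purrr: iterate over the screen names and user IDs
## in parallel, collecting the API responses in a list.
rs <- purrr::map2(highn$screen_name, highn$user_id, block_user)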

If all of our requests succeeded, then I won’t see any tweets from @wilson_the_dogg, @overthemoonmark, or @OttawaGordon anymore!

Now we know how to find and block antagonistic Twitter users from R. This is useful if you just want to check in and block users occasionally, but for more permanent protection, you might consider a tool like Bot Sentinel’s autoblocker.


This post was compiled on 2020-10-12 14:42:24. Since that time, there may have been changes to the packages that were used in this post. If you can no longer use this code, please notify the author in the comments below.

Packages used in this post
sessioninfo::package_info(dependencies = "Depends")

#>  package     * version    date       lib source                         
#>  askpass       1.1        2019-01-13 [1] RSPM (R 4.0.0)                 
#>  assertthat    0.2.1      2019-03-21 [1] RSPM (R 4.0.0)                 
#>  cli           2.0.2      2020-02-28 [1] RSPM (R 4.0.0)                 
#>  colorspace    1.4-1      2019-03-18 [1] RSPM (R 4.0.0)                 
#>  crayon        1.3.4      2017-09-16 [1] RSPM (R 4.0.0)                 
#>  curl          4.3        2019-12-02 [1] RSPM (R 4.0.0)                 
#>  digest        0.6.25     2020-02-23 [1] RSPM (R 4.0.0)                 
#>  downlit       0.2.0      2020-09-25 [1] RSPM (R 4.0.2)                 
#>  dplyr       * 1.0.2      2020-08-18 [1] RSPM (R 4.0.2)                 
#>  ellipsis      0.3.1      2020-05-15 [1] RSPM (R 4.0.0)                 
#>  evaluate      0.14       2019-05-28 [1] RSPM (R 4.0.0)                 
#>  fansi         0.4.1      2020-01-08 [1] RSPM (R 4.0.0)                 
#>  farver        2.0.3      2020-01-16 [1] RSPM (R 4.0.0)                 
#>  fs            1.5.0      2020-07-31 [1] RSPM (R 4.0.2)                 
#>  generics      0.0.2      2018-11-29 [1] RSPM (R 4.0.0)                 
#>  ggplot2     * 3.3.2      2020-06-19 [1] RSPM (R 4.0.1)                 
#>  glue          1.4.2      2020-08-27 [1] RSPM (R 4.0.2)                 
#>  gtable        0.3.0      2019-03-25 [1] RSPM (R 4.0.0)                 
#>  htmltools     0.5.0      2020-06-16 [1] RSPM (R 4.0.1)                 
#>  httr        * 1.4.2      2020-07-20 [1] RSPM (R 4.0.2)                 
#>  hugodown      0.0.0.9000 2020-10-08 [1] Github (r-lib/hugodown@18911fc)
#>  jsonlite      1.7.1      2020-09-07 [1] RSPM (R 4.0.2)                 
#>  knitr         1.30       2020-09-22 [1] RSPM (R 4.0.2)                 
#>  labeling      0.3        2014-08-23 [1] RSPM (R 4.0.0)                 
#>  lifecycle     0.2.0      2020-03-06 [1] RSPM (R 4.0.0)                 
#>  magrittr      1.5        2014-11-22 [1] RSPM (R 4.0.0)                 
#>  munsell       0.5.0      2018-06-12 [1] RSPM (R 4.0.0)                 
#>  openssl       1.4.3      2020-09-18 [1] RSPM (R 4.0.2)                 
#>  pillar        1.4.6      2020-07-10 [1] RSPM (R 4.0.2)                 
#>  pkgconfig     2.0.3      2019-09-22 [1] RSPM (R 4.0.0)                 
#>  purrr         0.3.4      2020-04-17 [1] RSPM (R 4.0.0)                 
#>  R6            2.4.1      2019-11-12 [1] RSPM (R 4.0.0)                 
#>  rlang         0.4.7      2020-07-09 [1] RSPM (R 4.0.2)                 
#>  rmarkdown     2.3        2020-06-18 [1] RSPM (R 4.0.1)                 
#>  rtweet      * 0.7.0      2020-01-08 [1] RSPM (R 4.0.0)                 
#>  scales        1.1.1      2020-05-11 [1] RSPM (R 4.0.0)                 
#>  sessioninfo   1.1.1      2018-11-05 [1] RSPM (R 4.0.0)                 
#>  stringi       1.5.3      2020-09-09 [1] RSPM (R 4.0.2)                 
#>  stringr       1.4.0      2019-02-10 [1] RSPM (R 4.0.0)                 
#>  tibble      * 3.0.3      2020-07-10 [1] RSPM (R 4.0.2)                 
#>  tidyselect    1.1.0      2020-05-11 [1] RSPM (R 4.0.0)                 
#>  utf8          1.1.4      2018-05-24 [1] RSPM (R 4.0.0)                 
#>  vctrs         0.3.4      2020-08-29 [1] RSPM (R 4.0.2)                 
#>  withr         2.3.0      2020-09-22 [1] RSPM (R 4.0.2)                 
#>  xfun          0.18       2020-09-29 [2] RSPM (R 4.0.2)                 
#>  yaml          2.2.1      2020-02-01 [1] RSPM (R 4.0.0)                 
#> 
#> [1] /home/conor/Library
#> [2] /usr/local/lib/R/site-library
#> [3] /usr/local/lib/R/library