Deploying R Models with MLflow and Docker

MLflow is a platform for the “machine learning lifecycle”. It’s a suite of tools for managing models, with tracking of hyperparameters and metrics, a registry of models, and options for serving. It’s this last bit that I’m going to focus on today.

I haven’t been able to find much discussion or documentation about MLflow’s support for R. There’s the RStudio MLflow example, but I wanted to see if I could use MLflow to serve something more complex. I’m going to use the crate MLflow flavour along with Docker to see if MLflow can serve R models whose preprocessing and prediction pipelines are compositions of multiple functions, as is standard in R.

MLflow serves every model as an API, and it’s an approach that I like. I can imagine serving multiple models simultaneously and querying them with a common dataset to compare performance.

I need to stress that MLflow isn’t just for serving models — one of its major appeals is the logging of hyperparameters and metrics in a model registry, along with a beautiful UI. I’m ignoring those components here, but that doesn’t mean they’re not important.

Packaging models with crate

MLflow serves models through “flavours”, which usually correspond to machine learning frameworks. In Python there are scikit-learn, TensorFlow, and PyTorch flavours, amongst others. In R there are just two: keras and crate. I’m not particularly interested in deep learning, so I’ll focus on crate.

crate is a function provided by the carrier package. It allows R functions to be packaged up so that they can be sent off to a different R process. It’s easy to see why this would be useful for serving machine learning models, since the goal is to package up a machine learning model and deploy it in some other environment.

Let’s take a look at packaging up a simple linear regression with crate:

starwars_height_lm <- lm(height ~ mass, data = dplyr::starwars)
packaged_starwars_height_lm <- carrier::crate(
    function(x) stats::predict.lm(starwars_height_lm, newdata = x),
    starwars_height_lm = starwars_height_lm
)

A crate function call consists of a main function, which has to be “freshly” defined within the call, along with a list of objects that accompany the function. I can serialise this packaged_starwars_height_lm crate and move it to another R process, and the linear model I trained will move along with it. Serialising in MLflow is done with the S3 generic mlflow::mlflow_save_model.
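
Just to check that the crate behaves like an ordinary function, I can call it with some made-up data:

packaged_starwars_height_lm(data.frame(mass = 80))
# returns the predicted height for a mass of 80, using the packaged model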

A couple of things to note here: I have to be very explicit about how I use functions in crate. Just typing predict wouldn’t do here: I have to use the specific predict.lm method for linear models. I also have to declare that it’s from the stats package. According to the help file, the accompanying objects will be automatically named after themselves if no name is provided, but I haven’t found this to be true.

The problem

There are no package dependencies in the above linear model (well, there’s stats, but that’s always going to be available) so it will work out of the box in any R process. For any “real life” model, there will be dependencies. In particular, I almost always work with a package workflow. My modelling helper functions are contained within a package dedicated to that one model. Each of those functions is a dependency that has to be included in the crated function.

I’ll use a simple example. The below won’t work, because I haven’t given crate the three accompanying functions it needs:

triple <- function(x) 3*x
square <- function(x) x**2
triplesquare <- function(x) triple(square(x))
fn <- carrier::crate(function(x) triplesquare(x))
fn(2)
#> Error in triplesquare(x): could not find function "triplesquare"

If I provide the three functions, everything works:

fn <- carrier::crate(
  function(x) triplesquare(x),
  triplesquare = triplesquare,
  square = square,
  triple = triple)
fn(2)
#> [1] 12

But then, if I delete the functions from the global environment, the crated function no longer works:

rm(triple, square, triplesquare)
fn(2)
#> Error in triple(square(x)): could not find function "triple"

I need to be able to provide these functions to crate in a way that they can be carried along with the crated function somehow.

Option 1: Install the package

If I’m using a package workflow, then the obvious solution is to install the package. I’ll be using my usual “ReviewSentiment” model as an example, here as a package called ReviewSentimentMLflow. This package trains a random forest model that predicts the sentiment of brief product reviews. The random forest has three artefacts: a review_rf model object, along with vectoriser and tfidf objects for preprocessing. I can crate all of this up along with my sentiment predict function as below. Note the explicit mention of the ReviewSentimentMLflow namespace, which is required if I’m installing the package:

crated_model <- carrier::crate(
  function(review) { # Function must be "fresh", i.e. not pre-defined
    ReviewSentimentMLflow::sentiment(review, review_rf, vectoriser, tfidf)
  },
  review_rf = review_rf,
  vectoriser = vectoriser,
  tfidf = tfidf
)

The objects I specify — review_rf, vectoriser and tfidf — are not part of the package. They are model artefacts generated during training. crate can handle those as is.

That sentiment function is where the problem lies. It’s one of the functions in my package. It uses the vectoriser and tfidf to process text into a format that can be handled by the random forest predictor. It calls on other package functions to do this, and it’s these underlying dependencies that will cause issues with crate.

I can serialise crated_model with mlflow_save_model, and then everything can be exported to another platform. There are a few helper functions used in the definition of sentiment, but as long as that ReviewSentimentMLflow package is installed on that platform I can serve the model with MLflow using the terminal command mlflow models serve.
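
As a sketch, the export and serve steps look something like this (the artefacts/model path is the same one the Dockerfile below serves from):

mlflow::mlflow_save_model(crated_model, "artefacts/model")
# then, on a machine with ReviewSentimentMLflow installed, from a terminal:
# mlflow models serve -m artefacts/model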

But the packages I use for my models are highly specific to a certain dataset and use case; they certainly aren’t going on CRAN. The underlying motivation here is that I want to be able to execute my model on a machine that isn’t my computer, so that means I need to be able to move my package along with the crated_model.

Accept that everything will be put in a container eventually

Docker containers are the go-to solution for reproducibility. The idea is straightforward: put the crated model and all dependencies into a container, so that everything moves as one. There’s ample support for running containers, especially on the cloud, and containerisation is arguably the gold standard for reproducible workflows.

I first learned how to use R with Docker at useR 2018, but incorporating everything into a Dockerfile was still a real challenge. This is what worked in the end:

FROM rocker/r-ver:4.0.0
ENV RENV_VERSION 0.10.0
ENV CRAN_REPO https://packagemanager.rstudio.com/all/__linux__/focal/latest
ENV MINICONDA_INSTALLER Miniconda3-py38_4.8.3-Linux-x86_64.sh
ENV LISTENING_HOST 0.0.0.0
ENV LISTENING_PORT 5000
# Copy the entirety of the context into the image. This should be the R package source.
ADD . /model-package/
WORKDIR /model-package

# Install system dependencies. I couldn't get sysreqs to work here, since python-minimal
# isn't available on this implementation of rocker. Curl is required to download Miniconda.
RUN apt-get -y update && \
    apt-get install -y curl libgit2-dev libssl-dev zlib1g-dev \
    pandoc pandoc-citeproc make libxml2-dev libgmp-dev libgfortran4 \
    libcurl4-openssl-dev libssh2-1-dev libglpk-dev git-core

# renv::restore can be a bit buggy if .Rprofile and the renv directory exist
RUN rm -f .Rprofile
RUN rm -rf renv
RUN Rscript -e "install.packages('remotes', repos = c(CRAN = Sys.getenv('CRAN_REPO')))"
RUN Rscript -e "remotes::install_github('rstudio/renv', ref = Sys.getenv('RENV_VERSION'))"
RUN Rscript -e "renv::restore(repos = c(CRAN = Sys.getenv('CRAN_REPO')))"

# Install miniconda to /miniconda and install mlflow
RUN curl -LO https://repo.anaconda.com/miniconda/$MINICONDA_INSTALLER
RUN bash $MINICONDA_INSTALLER -p /miniconda -b
RUN rm $MINICONDA_INSTALLER
ENV PATH=/miniconda/bin:${PATH}
RUN pip install mlflow
ENV MLFLOW_BIN /miniconda/bin/mlflow
ENV MLFLOW_PYTHON_BIN /miniconda/bin/python

RUN Rscript -e "if (!require('devtools')) install.packages('devtools', repos = Sys.getenv('CRAN_REPO'))"
RUN Rscript -e "devtools::install(dependencies = FALSE)"

ENTRYPOINT ["/usr/bin/env"]
CMD mlflow models serve -m artefacts/model --host $LISTENING_HOST --port $LISTENING_PORT

The flow of this Dockerfile is:

  1. Start with R (thanks to the Rocker project)
  2. Set some environment variables to guide reproducibility
  3. Copy the entire model package source code into the image (which contains the trained model artefacts, including the crated model)
  4. Install system dependencies with apt-get
  5. Install R package dependencies with renv
  6. Install Miniconda (a minimal version of Anaconda)
  7. Install the Python MLflow module and configure its environment variables (required to run MLflow, even in R)
  8. Install the model package
  9. Serve the model with mlflow

I’ve used renv to lock down the package versions. I’m also using the RStudio Package Manager to download binaries instead of source code, which greatly reduces the package install time.
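
For context, that lockfile is generated from the package project itself; a minimal sketch of how it comes about, assuming the project uses renv:

renv::init()      # once, to set up renv for the project
renv::snapshot()  # record the current package versions in renv.lock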

To build the image, I navigate to the directory containing the package code and run the following in a terminal:

docker build --tag review-sentiment .

The build process will take some time, as it has to pull in all of the packages recorded in the renv lockfile. The resulting image is 2.5GB, which is disappointing given that the model artefacts (including the random forest) are altogether under a megabyte when compressed. A Docker guru could no doubt bring this size down, but there is a storage penalty for exporting an entire environment in which to run a model.

To run the model, I enter the following command at a terminal:

docker run -p 5000:5000 review-sentiment

MLflow serves models as APIs, so I can query this model with curl:

curl -X POST "http://127.0.0.1:5000/predict/" -H  "accept: application/json" -H  "Content-Type: application/json" -d "\"love\""
# "good"
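
The same request can be made from R; here’s a sketch using httr, assuming the container is running locally on port 5000:

response <- httr::POST(
  "http://127.0.0.1:5000/predict/",
  httr::content_type_json(),
  body = '"love"' # a JSON-encoded string, matching the curl example above
)
httr::content(response)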

This is a highly portable way of exporting a model. Actually, it doesn’t matter too much how the model is served here — a model exposed with the plumber package would work just as well.

I think this approach betrays the objective of MLflow. I already have an exported model object, and it’s reasonable to expect that the model object should work as is on any other machine. I can understand why I would use a container to provide a reproducible environment in which to serve that model, but it’s MLflow’s responsibility to do the actual serving.

I think it would be better to separate responsibilities here: containers provide a reproducible environment for model serving, but MLflow does the serving independently of the container.

Option 2: Don’t install the package

Consider the original problem of crating model objects along with the helper functions required to work with them. There’s another solution to this, and it’s suggested by examples in the carrier documentation: I can take my functions, rip them out of their package environment, and stick them into the crate environment.

This is done through the rlang::set_env function, which returns a copy of the function in a new environment. If I don’t specify the environment, it defaults to the caller environment, which in the case below is that of crate:

triple <- function(x) 3*x
square <- function(x) x**2
triplesquare <- function(x) triple(square(x))
fn <- carrier::crate(
  function(x) triplesquare(x),
  triplesquare = rlang::set_env(triplesquare),
  square = rlang::set_env(square),
  triple = rlang::set_env(triple)
)
rm(triple, square, triplesquare)
fn(2)
#> [1] 12

Metaprogramming is magic

I don’t want to manually type out every single function in my package, but because I’m using R I don’t have to. In R I can formulate the expressions I want to evaluate but do the evaluating later. This is called non-standard evaluation, or metaprogramming. Let’s suppose I have a character vector of the names of the functions I want to apply the set_env treatment to, say c("triplesquare", "square", "triple"):

library(rlang)
triple <- function(x) 3*x
square <- function(x) x**2
triplesquare <- function(x) triple(square(x))

functions_to_crate <- c("triplesquare", "square", "triple")
functions_to_set_env <- lapply(functions_to_crate, function (x) {
  expr(set_env(!!sym(x)))
})
names(functions_to_set_env) <- functions_to_crate

fn <- carrier::crate(
  function(x) triplesquare(x),
  !!!functions_to_set_env
)
rm(triple, square, triplesquare)
fn(2)
#> [1] 12

Metaprogramming is one of the trickier parts of R. It’s not a standard feature of programming languages, so anyone who isn’t coming from a lisp background is likely to be confused. I’ll break down what’s happening here, but for a full introduction to metaprogramming there’s no better resource than Advanced R.

I’m using the rlang package which provides a nicer metaprogramming interface with a few more features. The core idea here is that sometimes I want to save an expression to be evaluated for later (with expr), but sometimes I want to evaluate it right now (with !!) — a concept called quasiquotation. Consider the example below:

expr(set_env(!!sym(x)))

I’m giving R here an expression set_env(!!sym(x)) but, because I’ve wrapped it in expr, I’m telling R not to evaluate it immediately. Except there is a part here that I do want to evaluate immediately: x is a character that I want to convert into a symbolic value. That is, I want to convert "triple" into triple. I can do this with the sym function and, by prefacing it with !!, I can tell R to ignore the expr and do this conversion immediately:

expr(set_env(!!sym("triple")))
#> set_env(triple)

I can see how this expression would be evaluated by directly inspecting the abstract syntax tree (AST) with lobstr. First, without the !!:

lobstr::ast(expr(set_env(sym("triple"))))
#> expr 
#> └─set_env 
#>   └─sym 
#>     └─"triple"

…and now with the !!:

lobstr::ast(expr(set_env(!!sym("triple"))))
#> expr 
#> └─set_env 
#>   └─triple

The !! forces the evaluation of the AST at sym("triple"), without evaluating the rest of the expression. So the expression I have at the end is just set_env(triple).

I’ve now got an expression that I can evaluate when I want and in whatever environment I want. I’m generating code with code! And with lapply I can generate an expression like this for every function, and end up with a named list of expressions.
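
As a small illustration (re-defining triple, since I removed it above), a saved expression like this can be evaluated later with eval():

triple <- function(x) 3*x
e <- expr(set_env(!!sym("triple")))
eval(e) # a copy of triple, bound to the environment in which the expression is evaluated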

I’ve saved these expressions so that I can evaluate them in the call to crate, which will copy every function into the crate environment. I do this with !!! (or “bang-bang-bang”). This forces the evaluation of every element in my list of expressions and uses them as arguments to the crate function:

fn <- carrier::crate(
  function(x) triplesquare(x),
  !!!functions_to_set_env
)

Crate everything

I have a method for taking a character vector of functions and including them in a crated function. I’m going to apply that method to a model package, in which all of my functions are in the package namespace:

# Already defined: model_package_path (the path to the model package source)
loadd(review_rf, vectoriser, tfidf) # load the model artefacts from the drake cache
package_name <- pkgload::pkg_name(model_package_path)
package_namespace_ls <- ls(getNamespace(package_name))
package_contents_to_set_env <- lapply(package_namespace_ls, function (x) {
  rlang::expr(rlang::set_env(!!rlang::sym(x)))
})
names(package_contents_to_set_env) <- package_namespace_ls
crated_model <- carrier::crate(
  function(review) {
    sentiment(review, review_rf, vectoriser, tfidf)
  },
  review_rf = review_rf,
  vectoriser = vectoriser,
  tfidf = tfidf,
  !!!package_contents_to_set_env
)
crated_model
#> Registered S3 method overwritten by 'pryr':
#>   method      from
#>   print.bytes Rcpp
#> <crate> 8.03 MB
#> * function: 18.1 kB
#> * `create_tfidf`: 8.02 MB
#> * `create_vocabulary`: 8.02 MB
#> * `download_and_read_data`: 8.02 MB
#> * `execution_plan`: 8.02 MB
#> * `generate_roc`: 8.02 MB
#> * `map_to_dtm`: 8.02 MB
#> * `new_data_to_be_scored`: 8.02 MB
#> * `read_review_file`: 8.02 MB
#> * `sentiment`: 8.02 MB
#> * `stem_tokeniser`: 8.02 MB
#> * `text_preprocessor`: 8.02 MB
#> * `training_plan`: 8.02 MB
#> * `review_rf`: 7.44 MB
#> * `tfidf`: 280 kB
#> * `vectoriser`: 85.8 kB
#> function(review) {
#>     sentiment(review, review_rf, vectoriser, tfidf)
#>   }

How cool is that? Everything in the model package is now also in the crated model, and it was all picked up automatically. There seem to be some issues with how this crated_model is printed, as those individual functions are not 8MB each. The actual crate is around 8MB, which compresses to under 1MB — roughly the same as the crate without the functions.

I can now export this crated model with mlflow::mlflow_save_model(crated_model, "artefacts/model"). I’ll still use a Docker image for reproducibility, and the Dockerfile will look almost identical to the first one:

FROM rocker/r-ver:4.0.0
ENV RENV_VERSION 0.10.0
ENV CRAN_REPO https://packagemanager.rstudio.com/all/__linux__/focal/latest
ENV MINICONDA_INSTALLER Miniconda3-py38_4.8.3-Linux-x86_64.sh
# Copy only the renv.lock file into the image; no package source or model artefacts are baked in.
ADD renv.lock /

# Install system dependencies. I couldn't get sysreqs to work here, since python-minimal
# isn't available on this implementation of rocker. Curl is required to download Miniconda.
RUN apt-get -y update && \
    apt-get install -y curl libgit2-dev libssl-dev zlib1g-dev \
    pandoc pandoc-citeproc make libxml2-dev libgmp-dev libgfortran4 \
    libcurl4-openssl-dev libssh2-1-dev libglpk-dev git-core

RUN Rscript -e "install.packages('remotes', repos = c(CRAN = Sys.getenv('CRAN_REPO')))"
RUN Rscript -e "remotes::install_github('rstudio/renv', ref = Sys.getenv('RENV_VERSION'))"
RUN Rscript -e "renv::restore(repos = c(CRAN = Sys.getenv('CRAN_REPO')))"

# Install miniconda to /miniconda and install mlflow
RUN curl -LO https://repo.anaconda.com/miniconda/$MINICONDA_INSTALLER
RUN bash $MINICONDA_INSTALLER -p /miniconda -b
RUN rm $MINICONDA_INSTALLER
ENV PATH=/miniconda/bin:${PATH}
RUN pip install mlflow
ENV MLFLOW_BIN /miniconda/bin/mlflow
ENV MLFLOW_PYTHON_BIN /miniconda/bin/python

ENTRYPOINT ["/usr/bin/env"]

A couple of differences between this Dockerfile and the first one:

  • I’m only copying one file from the model package — the renv.lock file. That is, the only information I’m baking into the image from the model package is the list of package dependencies.
  • I’m no longer installing the model package into the image.
  • I’m no longer running mlflow models serve within the image itself. The image is just an environment in which commands are run.

This last point is a pretty big deal. I’ve changed my approach to reproducibility here by drawing a line between the environment and the model. I can swap in a new crated model without having to rebuild the image, because the model is no longer baked into the image. I could even share a single image across multiple containers.

There’s one snag here: renv dependencies. If I add another package as a dependency to the model, I’ll need to rebuild the image. It’s possible to use caching to speed things up, but I wonder if it’s possible to use the RStudio Package Manager to pin my dependencies by date, and then have the image install new packages as needed. That way, as long as I use the same date-locked repository in both development and the Dockerfile, I won’t have to rebuild the image every time I introduce a new dependency. My Docker skills aren’t up to this task, but it doesn’t sound impossible.
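
As a rough sketch of that idea, development and the Dockerfile would both point at the same dated snapshot of the repository. The URL below is a placeholder; the exact snapshot format depends on the package manager:

# <snapshot-date> stands in for a date-locked snapshot of the repository
options(repos = c(CRAN = "https://packagemanager.rstudio.com/all/__linux__/focal/<snapshot-date>"))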

I’ll build the image as before, but give it a different tag:

docker build --tag review-sentiment-env-only .

My Docker image contains only the environment, so docker run is a little different. I mount the exported model as a volume within the container, and I give the mlflow models serve command when I run the image, not when I build it.

docker run -p 5000:5000 -v $(pwd)/artefacts/model:/model review-sentiment-env-only mlflow models serve -m model --host 0.0.0.0 --port 5000

But that doesn’t work

I really thought this would work, but when I try to query the API I get a dependency issue:

curl -X POST "http://127.0.0.1:5000/predict/" -H  "accept: application/json" -H  "Content-Type: application/json" -d "\"love\""
# Invalid Request.  could not find function "%>%"

Earlier I mentioned that crate expects specifically declared functions. I couldn’t use predict.lm for a linear model; I had to use stats::predict.lm. Well, in my ReviewSentimentMLflow package I import %>% from dplyr/magrittr and use it without the double-colon reference. That’s why R can’t find %>% here: it doesn’t know which namespace to look in. Importing dplyr or magrittr won’t fix this issue either, since R won’t know to look in those namespaces.

I don’t want to have to type magrittr::`%>%` every time I want to pipe, so I’ll have to include this function in the call to crate. I won’t use rlang::set_env this time, because I want these functions to keep their namespaces. When I added %>% to the crate I also noticed that the S3 method randomForest:::predict.randomForest was being called in the sentiment function. Both of these functions are included below:

crated_model <- carrier::crate(
  function(review) {
    sentiment(review, review_rf, vectoriser, tfidf)
  },
  review_rf = review_rf,
  vectoriser = vectoriser,
  tfidf = tfidf,
  !!!package_contents_to_set_env,
  "%>%" = magrittr::`%>%`,
  "predict.randomForest" = randomForest:::predict.randomForest
)

It’s a lot of work, declaring all of these dependencies, but now my MLflow model is being successfully served:

curl -X POST "http://127.0.0.1:5000/predict/" -H  "accept: application/json" -H  "Content-Type: application/json" -d "\"love\""
# "good"

crate has all of the dependencies now, but declaring those dependencies looks very hacky. I’m not sure if I’d call this a solution.

But this does work!

After I published this post, Nick DiQuattro came up with a great idea: stick the environment of the model package into the crate function. And it works!

According to the documentation for rlang::ns_env, the package namespace is an environment where all of the functions of the package live. “The parent environments of namespaces are the imports environments, which contain all the functions imported from other packages”. So I’m going to take those imported functions and stick them into crate, without having to manually declare each one.

The process is similar to defining package_contents_to_set_env:

import_env <- rlang::ns_imports_env(package_name)
imported_functions_names <- ls(import_env)
imported_functions_to_declare <- lapply(
  imported_functions_names,
  function(x) rlang::expr(import_env[[!!x]])
)
names(imported_functions_to_declare) <- imported_functions_names

Now my crate call looks like this:

crated_model <- carrier::crate(
  function(review) {
    sentiment(review, review_rf, vectoriser, tfidf)
  },
  review_rf = review_rf,
  vectoriser = vectoriser,
  tfidf = tfidf,
  !!!package_contents_to_set_env,
  !!!imported_functions_to_declare
)
crated_model
#> <crate> 8.1 MB
#> * function: 12 kB
#> * `create_tfidf`: 8.09 MB
#> * `create_vocabulary`: 8.09 MB
#> * `download_and_read_data`: 8.09 MB
#> * `execution_plan`: 8.09 MB
#> * `generate_roc`: 8.09 MB
#> * `map_to_dtm`: 8.09 MB
#> * `new_data_to_be_scored`: 8.09 MB
#> * `read_review_file`: 8.09 MB
#> * `sentiment`: 8.09 MB
#> * `stem_tokeniser`: 8.09 MB
#> * `text_preprocessor`: 8.09 MB
#> * `training_plan`: 8.09 MB
#> * `review_rf`: 7.44 MB
#> * `tfidf`: 280 kB
#> * `vectoriser`: 85.8 kB
#> * `%>%`: 30.8 kB
#> * `system.file`: 17.5 kB
#> * `trigger`: 14.5 kB
#> * `library.dynam.unload`: 8.31 kB
#> * `randomForest`: 1.14 kB
#> function(review) {
#>     sentiment(review, review_rf, vectoriser, tfidf)
#>   }

If I run my environment Docker image and serve this crated model, it works! It’s still a bit hacky, but not as bad as manually declaring every imported function. And, because randomForest::randomForest is an imported function in the NAMESPACE, the S3 method predict.randomForest is carried along with it. This means that I can just use predict in my internal functions, and R will be able to dispatch correctly.

This only works because in a package workflow I declare my imported functions carefully with Roxygen tags. So the namespace contains lines like importFrom(randomForest,randomForest). And if I’m not importing functions, I’m using them with double colons like dplyr::mutate. Because of this, crate knows where to find the functions I’m using.
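
For reference, this is roughly how those declarations look in the package source; the tags below are illustrative rather than copied from the real package, and roxygen2 turns them into the NAMESPACE entries mentioned above:

#' @importFrom randomForest randomForest
#' @importFrom magrittr %>%
sentiment <- function(review, review_rf, vectoriser, tfidf) {
  # ... preprocess the review with the vectoriser and tfidf, then predict ...
}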

Thank you so much Nick!

MLflow and R

Overall, I don’t feel confident using MLflow to deploy and serve an R model. The support through the carrier package is promising, but not yet mature enough to serve anything other than simple models with simple preprocessing. I’ve had to get around this by applying some metaprogramming hacks.

I think the carrier package is a great approach to exporting an R model, and that the ability to export an arbitrary function would be more flexible than exporting an object in a given machine learning framework. But the package needs more power in terms of dependency detection.

It’s reasonable to expect that R models will be developed in package workflows, so that users can take advantage of powerful packages like devtools, testthat, and roxygen2. Dependencies are clearly declared in package workflows, and R CMD check will yell at the user if a dependency isn’t listed. Given this, I think that carrier and MLflow can be advanced together by implementing automatic detection of dependencies within a package workflow.

In particular, carrier could be improved by

  1. supporting the importation of all functions within a given package into a crate call,
  2. supporting the importation of all declared imports within a NAMESPACE file (which would cover the above %>% issue), and
  3. providing better support for S3 dispatch.

I’d be cautious of doing this sort of global package import with arbitrary packages (it would fall apart as soon as it encounters compiled code), but I think it’s suitable for package workflows or “models as packages”. I’ve also heard of a package called defer which may do some of the above, although I haven’t looked into it.

I’ve outlined some approaches here for accommodating a package workflow with carrier, so there’s possibly some room here for me to contribute.


devtools::session_info()
#> ─ Session info ───────────────────────────────────────────────────────────────
#>  setting  value                       
#>  version  R version 4.0.0 (2020-04-24)
#>  os       Ubuntu 20.04 LTS            
#>  system   x86_64, linux-gnu           
#>  ui       X11                         
#>  language en_AU:en                    
#>  collate  en_AU.UTF-8                 
#>  ctype    en_AU.UTF-8                 
#>  tz       Australia/Melbourne         
#>  date     2020-07-06                  
#> 
#> ─ Packages ───────────────────────────────────────────────────────────────────
#>  ! package               * version    date       lib
#>    askpass                 1.1        2019-01-13 [1]
#>    assertthat              0.2.1      2019-03-21 [1]
#>    backports               1.1.8      2020-06-17 [1]
#>    base64enc               0.1-3      2015-07-28 [1]
#>    base64url               1.4        2018-05-14 [1]
#>    callr                   3.4.3      2020-03-28 [1]
#>    carrier                 0.1.0      2018-10-16 [1]
#>    cli                     2.0.2      2020-02-28 [1]
#>    codetools               0.2-16     2018-12-24 [4]
#>    colorspace              1.4-1      2019-03-18 [1]
#>    crayon                  1.3.4      2017-09-16 [1]
#>    curl                    4.3        2019-12-02 [1]
#>    data.table              1.12.8     2019-12-09 [1]
#>    desc                    1.2.0      2018-05-01 [1]
#>    devtools                2.3.0      2020-04-10 [1]
#>    digest                  0.6.25     2020-02-23 [1]
#>    downlit                 0.0.0.9000 2020-06-15 [1]
#>    dplyr                   0.8.5      2020-03-07 [1]
#>    drake                 * 7.12.2     2020-06-02 [1]
#>    ellipsis                0.3.1      2020-05-15 [1]
#>    evaluate                0.14       2019-05-28 [1]
#>    fansi                   0.4.1      2020-01-08 [1]
#>    filelock                1.0.2      2018-10-05 [1]
#>    float                   0.2-4      2020-04-22 [1]
#>    forge                   0.2.0      2019-02-26 [1]
#>    fs                      1.4.1      2020-04-04 [1]
#>    generics                0.0.2      2018-11-29 [1]
#>    ggplot2                 3.3.0      2020-03-05 [1]
#>    glue                    1.4.1      2020-05-13 [1]
#>    gtable                  0.3.0      2019-03-25 [1]
#>    here                    0.1        2017-05-28 [1]
#>    hms                     0.5.3      2020-01-08 [1]
#>    htmltools               0.5.0      2020-06-16 [1]
#>    httpuv                  1.5.2      2019-09-11 [1]
#>    httr                    1.4.1      2019-08-05 [1]
#>    hugodown                0.0.0.9000 2020-06-20 [1]
#>    igraph                  1.2.5      2020-03-19 [1]
#>    ini                     0.3.1      2018-05-20 [1]
#>    janeaustenr             0.1.5      2017-06-10 [1]
#>    jsonlite                1.6.1      2020-02-02 [1]
#>    knitr                   1.28       2020-02-06 [1]
#>    later                   1.1.0.1    2020-06-05 [1]
#>    lattice                 0.20-41    2020-04-02 [4]
#>    lgr                     0.3.4      2020-03-20 [1]
#>    lifecycle               0.2.0      2020-03-06 [1]
#>    lobstr                  1.1.1      2019-07-02 [1]
#>    magrittr                1.5        2014-11-22 [1]
#>    Matrix                  1.2-18     2019-11-27 [4]
#>    memoise                 1.1.0.9000 2020-05-09 [1]
#>    mlapi                   0.1.0      2017-12-17 [1]
#>    mlflow                  1.9.0      2020-06-22 [1]
#>    munsell                 0.5.0      2018-06-12 [1]
#>    NLP                     0.2-0      2018-10-18 [1]
#>    openssl                 1.4.1      2019-07-18 [1]
#>    pillar                  1.4.4      2020-05-05 [1]
#>    pkgbuild                1.0.7      2020-04-25 [1]
#>    pkgconfig               2.0.3      2019-09-22 [1]
#>    pkgload                 1.0.2      2018-10-29 [1]
#>    plotROC                 2.2.1      2018-06-23 [1]
#>    prettyunits             1.1.1      2020-01-24 [1]
#>    processx                3.4.2      2020-02-09 [1]
#>    progress                1.2.2      2019-05-16 [1]
#>    promises                1.1.0      2019-10-04 [1]
#>    pryr                    0.1.4      2018-02-18 [1]
#>    ps                      1.3.3      2020-05-08 [1]
#>    purrr                   0.3.4      2020-04-17 [1]
#>    R6                      2.4.1      2019-11-12 [1]
#>    randomForest            4.6-14     2018-03-25 [1]
#>    Rcpp                    1.0.4.6    2020-04-09 [1]
#>    readr                   1.3.1      2018-12-21 [1]
#>    remotes                 2.1.1      2020-02-15 [1]
#>    reticulate              1.15       2020-04-02 [1]
#>  R ReviewSentimentMLflow * 0.1.0      <NA>       [?]
#>    RhpcBLASctl             0.20-17    2020-01-17 [1]
#>    rlang                 * 0.4.6      2020-05-02 [1]
#>    rmarkdown               2.3.1      2020-06-20 [1]
#>    rprojroot               1.3-2      2018-01-03 [1]
#>    rsparse                 0.4.0      2020-04-01 [1]
#>    rstudioapi              0.11       2020-02-07 [1]
#>    scales                  1.1.0      2019-11-18 [1]
#>    sessioninfo             1.1.1      2018-11-05 [1]
#>    slam                    0.1-47     2019-12-21 [1]
#>    SnowballC               0.7.0      2020-04-01 [1]
#>    storr                   1.2.1      2018-10-18 [1]
#>    stringi                 1.4.6      2020-02-17 [1]
#>    stringr                 1.4.0      2019-02-10 [1]
#>    swagger                 3.9.2      2018-03-23 [1]
#>    testthat              * 2.3.2      2020-03-02 [1]
#>    text2vec                0.6        2020-02-18 [1]
#>    tibble                  3.0.1      2020-04-20 [1]
#>    tidyselect            * 1.0.0      2020-01-27 [1]
#>    tidytext                0.2.4      2020-04-17 [1]
#>    tm                      0.7-7      2019-12-12 [1]
#>    tokenizers              0.2.1      2018-03-29 [1]
#>    txtq                    0.2.0      2019-10-15 [1]
#>    usethis                 1.6.1      2020-04-29 [1]
#>    vctrs                   0.3.1      2020-06-05 [1]
#>    withr                   2.2.0      2020-04-20 [1]
#>    xfun                    0.14       2020-05-20 [1]
#>    xml2                    1.3.2      2020-04-23 [1]
#>    yaml                    2.2.1      2020-02-01 [1]
#>    zeallot                 0.1.0      2018-01-28 [1]
#>  source                            
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  Github (r-lib/downlit@9191e1f)    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  Github (r-lib/hugodown@f7df565)   
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  Github (hadley/memoise@4aefd9f)   
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  <NA>                              
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  Github (rstudio/rmarkdown@b53a85a)
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#>  CRAN (R 4.0.0)                    
#> 
#> [1] /home/mdneuzerling/R/x86_64-pc-linux-gnu-library/4.0
#> [2] /usr/local/lib/R/site-library
#> [3] /usr/lib/R/site-library
#> [4] /usr/lib/R/library
#> 
#>  R ── Package was removed from disk.
