It almost goes without saying that programming languages and data science work well together. Most popular programming languages have powerful libraries dedicated to data science and statistical computing. But languages such as R take this to the next level. R, and its predecessor S, were specifically designed around statistical methodology. But don’t think for a second that this implies R is difficult to use.
In fact, the Shiny framework makes it easy to create R applications with rich GUIs that run on standard web servers and browsers. This form of development and deployment is quite different from what you'll find in most other programming languages, to the point where it's often difficult to know exactly what R Shiny can do. Take a look at the following R Shiny examples and ideas and you'll see just what the platform is capable of. Then you can tackle these concepts on your own or use them as a springboard for your own projects.
Introductory Level Projects
Every journey begins with a single step, and every foray into a new programming language begins with learning its development tools and syntax. We'll begin our look at R Shiny with some projects that are well within the reach of someone new to the language and framework. But make no mistake: these examples and ideas aren't just educational. They provide solid utility that you can easily leverage in your own projects.
1. Take your coding into the cloud
Anyone who regularly uses multiple programming languages can attest to some common annoyances. You're often stuck managing resource-heavy development environments. And if you're coding in multiple environments, you might find yourself juggling them across different pieces of hardware. Even keeping your codebase in sync can be tedious. But R Shiny has a unique solution to this problem, and it's something anyone starting out with R Shiny should learn about before doing anything else.
RStudio Cloud (now Posit Cloud) solves all of these problems and more by taking R Shiny development into the cloud. The IDE is essentially an HTML/JS frontend over a standard remote server with all of R's prerequisites. This type of setup is a huge boon to people just getting started with the platform, but the benefits carry on well beyond those first steps. RStudio Cloud's remote nature frees you from code versioning issues: no matter what device you use to access your code, it's all running on the same server.
Even deployment becomes a non-issue. One common problem with web-based GUIs is the need to carefully package programs for end users. But you can deploy code written in RStudio Cloud with a single click of the "publish" button. This also ensures that your code is relatively platform-agnostic. Even iOS and Android users can use web apps deployed through RStudio Cloud. There's no need to, for example, embed code into an Android webview-based app to get it onto Android devices.
2. Polish your code for mobile platforms
If you tried the previous idea on a mobile device you probably noticed two things. The first, of course, is that the R code works perfectly on mobile. But you probably also noticed that it doesn't feel much like a truly native app. Thankfully, it's extremely easy to polish an R Shiny app to feel more native, and the easiest way to accomplish this is with shinyMobile. Installing the shinyMobile package is a snap. At the R console, you just need to type the following.
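```r
install.packages("shinyMobile")
```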
And if you're using RStudio Cloud, you can instead open the IDE's Tools menu and select "Install Packages". Then type shinyMobile into the "Packages" field, leaving everything else in its default state.
From here you can use the Progressive Web App (PWA) system to make your app feel more native. For example, you can get rid of the navigation buttons and bars normally found on webpages.
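A minimal skeleton might look like the following. This is a sketch assuming a recent shinyMobile release; the exact arguments accepted by `f7Page()` have changed between versions, so check the package documentation for your installed version.

```r
library(shiny)
library(shinyMobile)

# f7Page wraps the app in a Framework7 shell that looks and feels
# like a native mobile app rather than a web page.
ui <- f7Page(
  title = "Mobile demo",
  f7SingleLayout(
    navbar = f7Navbar(title = "My mobile app"),
    f7Card("Hello from a native-feeling Shiny app!")
  )
)

server <- function(input, output, session) {}

shinyApp(ui, server)
```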
3. Use REST
Systems designed around statistics and data science obviously need to be able to source that information. R makes it easy to load data from local files. But a local intranet is nothing compared to the treasure trove of data on the Internet. And many sites even open up their data to third parties through representational state transfer (REST) APIs.
Given R's data-focused design, you might imagine that it'd be easy to make REST calls. If so, you'd be right. R provides a number of ways to consume REST APIs, but the easiest is the httr library. You can make a GET call and assign the result to a variable, like the following.
r <- GET("http://example.com/ourdata")
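In practice you'll usually want to check the response status and parse the body as well. A short sketch (the endpoint here is a placeholder, not a real API):

```r
library(httr)

# Placeholder endpoint; substitute a real API URL.
r <- GET("http://example.com/ourdata")

if (status_code(r) == 200) {
  # content() parses the body; a JSON response becomes an R list.
  data <- content(r, as = "parsed")
  str(data)
} else {
  warning("Request failed with HTTP status ", status_code(r))
}
```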
4. Make client-side API calls
You've seen how easy it is to make a REST call with R, and the ubiquity of REST APIs makes that skill all the more useful. Unfortunately, heavy use of REST has also led many services to lock down access. Some of the more popular sites with useful data limit the number of times any single IP address can call their API per day. The workaround is to make those calls client-side: run the request in each visitor's browser so it counts against that user's IP address rather than your server's.
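One way to issue a request from the visitor's browser is to embed a small piece of JavaScript and hand the result back to Shiny through `Shiny.setInputValue()`. This is a minimal sketch; the endpoint is hypothetical, and a real app would also handle fetch errors.

```r
library(shiny)

ui <- fluidPage(
  # The fetch() call runs in the user's browser, so the request comes
  # from their IP address, not the Shiny server's.
  tags$script(HTML("
    fetch('https://example.com/ourdata')   // hypothetical endpoint
      .then(resp => resp.json())
      .then(data => Shiny.setInputValue('api_data', data));
  ")),
  verbatimTextOutput("result")
)

server <- function(input, output, session) {
  # input$api_data updates once the browser-side call completes.
  output$result <- renderPrint(input$api_data)
}

shinyApp(ui, server)
```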
More Advanced Projects
So far we've looked at smaller examples and ideas of what you can do with R Shiny. But emergent possibilities are one of the best parts of any programming language. As you master smaller parts of the platform, you open up the possibility of much larger results. The following R Shiny concepts highlight some of the more advanced possibilities that open up after you've mastered the basics.
5. Set up your own R Shiny server
The free cloud-based solutions are undeniably useful, but there are real advantages to running your own R Shiny server. One of the biggest comes down to resources: on your own server you can use as much memory and CPU as the system has. This can be especially important when you're doing heavy calculations with huge data sets or using machine learning.
Setting up an R Shiny server is relatively straightforward, though it can be a little time-consuming if you're not very familiar with Linux or server environments. There are a few caveats to keep in mind, and one of the biggest is how you source the R packages. The easiest route is to start with Ubuntu and add the official R repository. Note, however, that R only provides full support for Ubuntu's Long Term Support (LTS) releases, so make sure your server is running an LTS release if you're using Ubuntu.
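The broad strokes look something like the following. This is a sketch for a recent Ubuntu LTS; repository codenames and the Shiny Server package name change over time, so treat the specifics here as illustrative and follow CRAN's and Posit's current instructions.

```shell
# Sketch for Ubuntu 22.04 LTS ("jammy"); adjust the codename for your release.
sudo apt update
sudo apt install -y software-properties-common dirmngr gdebi-core wget

# Add CRAN's signing key and repository so apt installs a current R build.
wget -qO- https://cloud.r-project.org/bin/linux/ubuntu/marutter_pubkey.asc | \
  sudo tee /etc/apt/trusted.gpg.d/cran_ubuntu_key.asc
sudo add-apt-repository "deb https://cloud.r-project.org/bin/linux/ubuntu jammy-cran40/"
sudo apt install -y r-base

# Install the shiny package system-wide, then Shiny Server itself.
sudo su - -c "R -e \"install.packages('shiny', repos='https://cran.rstudio.com/')\""
# Download the current Shiny Server .deb from Posit's site first; the
# filename below is a placeholder for whichever version you grabbed.
sudo gdebi shiny-server-*.deb
```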
6. See your city in a whole new way
You've already seen a few different methods for accessing REST data, and one interesting thing about the format is how common it's become. Look around your city or town and you'll probably find a wide variety of publicly accessible services that provide REST data. This often includes bus and train locations, traffic conditions, and weather.
You might even be able to find public data on people’s behavior. For example, crime statistics for particular areas, crowd congestion, how full public transit is, etc. You could easily combine all of that data through R Shiny and overlay it onto a map. If you wanted to take it even further you might work in machine learning to analyze points in the data which might have predictive value. Or you might be able to note patterns. For example, how weather conditions influence different behaviors in the larger population. If conditions X and Y correlate with full subway cars then you’d be able to know in advance if it was a bad day to ride the train. You can expand that predictive approach to a wide variety of different subjects.
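The map-overlay part of this idea is a natural fit for the leaflet package. A minimal sketch, with made-up vehicle data standing in for whatever your city's REST feed returns:

```r
library(shiny)
library(leaflet)

# Hypothetical transit data; in practice this would come from a REST feed.
vehicles <- data.frame(
  lat   = c(41.88, 41.89),
  lng   = c(-87.63, -87.62),
  label = c("Bus 22 - 80% full", "Bus 36 - 35% full")
)

ui <- fluidPage(leafletOutput("map"))

server <- function(input, output, session) {
  output$map <- renderLeaflet({
    # Plot each vehicle as a labeled marker on an OpenStreetMap base layer.
    leaflet(vehicles) |>
      addTiles() |>
      addMarkers(lng = ~lng, lat = ~lat, label = ~label)
  })
}

shinyApp(ui, server)
```

Refreshing the data on a timer (for example with `reactivePoll()`) would turn this into a live map.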
7. Implement scraping and NLP to get a more personal touch
The previous example considered ways to analyze human behavior within a limited geographic area. But what if you could take that a little further with R Shiny? For example, how would you go about mapping the general mood of people in a particular location? It might sound like science fiction, but you can actually implement the concept in R. After all, people's mood is essentially data, and data analysis is where R really shines. Likewise, Shiny shines at graphically representing that data.
Twitter's an outstanding source of data that's easy to filter by geo-tag, and R has a great library for pulling Twitter posts called rtweet. R also has a wealth of natural language processing tools, but the syuzhet package is often the best option for lightweight sentiment analysis, in large part because it concentrates most of its functionality in a single central function, get_sentiment().
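The core scoring step is only a few lines. The example posts below are invented stand-ins for what rtweet would return:

```r
library(syuzhet)

# Example text; in practice these would come from rtweet's search results.
posts <- c(
  "What a beautiful morning in the park!",
  "Stuck in traffic again, this commute is miserable."
)

# get_sentiment() returns one numeric score per string:
# positive values suggest a positive mood, negative values a negative one.
scores <- get_sentiment(posts, method = "syuzhet")
mean(scores)  # a rough aggregate "mood" for the batch
```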
Now imagine scraping Twitter posts whose geo-tags match your current location. You could have R clean up that data, pass the results to a syuzhet-based system for analysis, and overlay the scores onto a map of your area. You could also expand the idea to other real-time data sources. The result would be a real-time mood map of any area you're near, one that warns you away from places where people are having a bad time and points you toward locations and events people are enjoying.