web scraping - "Rescue" command in R?


I have the following code:

    library(rvest)

    url_list <- c("https://github.com/rails/rails/pull/100",
                  "https://github.com/rails/rails/pull/200",
                  "https://github.com/rails/rails/pull/300")

    mine <- function(url){
      url_content  <- html(url)
      url_mainnode <- html_node(url_content, "*")
      url_mainnode_text <- html_text(url_mainnode)
      url_mainnode_text <- gsub("\n", "", url_mainnode_text) # clean text
      url_mainnode_text
    }

    messages <- lapply(url_list, mine)

However, when I make the list longer, I tend to run into an error like:

    Error in html.response(r, encoding = encoding) :
      Server error: (500) Internal Server Error

I know that in Ruby I can use rescue to keep iterating through a list even when an item raises an error, but my attempts at applying it to this function have failed. Is there something similar in R?

One option is to use try(). For more info, see here. Here's an implementation:

    library(rvest)

    url_list <- c("https://github.com/rails/rails/pull/100",
                  "https://github.com/rails/rails/pull/200",
                  "https://github.com/rails/rails/pull/300")

    mine <- function(url){
      url_content <- try(html(url))
      # if the fetch failed, bail out with NA so lapply() keeps going
      if (inherits(url_content, "try-error")) return(NA)
      url_mainnode <- html_node(url_content, "*")
      url_mainnode_text <- html_text(url_mainnode)
      url_mainnode_text <- gsub("\n", "", url_mainnode_text) # clean text
      url_mainnode_text
    }

    messages <- lapply(url_list, mine)
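If you want something closer to Ruby's rescue, base R also has tryCatch(), which lets you attach an error handler directly to an expression. Here's a minimal sketch along the same lines as the function above (mine_safe is just an illustrative name, not part of the original code):

    library(rvest)

    mine_safe <- function(url){
      tryCatch({
        url_content  <- html(url)
        url_mainnode <- html_node(url_content, "*")
        gsub("\n", "", html_text(url_mainnode))  # clean text
      }, error = function(e) NA)  # on any error, return NA and keep iterating
    }

    messages <- lapply(url_list, mine_safe)

Failed URLs come back as NA, so you can drop them afterwards with messages[!is.na(messages)]. Also note that in more recent versions of rvest, html() has been deprecated in favor of read_html().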
