SBYC Wet Wednesday July 10 2019 on J70 Escape
Video created with a GoPro Hero 7 recorded at 1080p60. Data from GoPro Hero 7 internal GPS. Overlay software: Dashware.
SBYC Wet Wednesday, June 19, 2019. We thought we had a good start but got caught in dirty air. We probably should have just embraced the suck on the way up. Still, a fun race.
| Measurement | Hi/Lo | Value | Time |
|---|---|---|---|
| Wind | HI | 16 mph | 4:44 PM |
| Wind | LO | 0 mph | 3:54 AM |
| Wind Gust | HI | 19 mph | 4:44 PM |
| Wind Gust | LO | 0 mph | 5:49 AM |
| Air Temp | HI | 65 °F | 3:29 PM |
| Air Temp | LO | 60 °F | 8:54 AM |
One issue I have been having with sailing is too much sun. Friends and doctors have been giving me grief that I do not use enough sunblock. This race I covered up. I still need to find a better hat and a sunblock that actually works for me.
Another gotcha I learned about the GoPro Hero 7 Black: you do have to sync the time and date before using it. I was able to use GoPro Quik and MP4Joiner, but I could not get RaceRenderer to work. One more gotcha: the unit needs to be turned on for a while before recording so the GPS can sync up.
We went out in some very heavy winds. Not many boats went out that day; it might have had something to do with the wind. In the video, you may hear a lot of yelling of "STARBOARD." The video does not show it, but we had a very close call.
All in all a very fun day!
To give you an idea of what it was like:
| Measurement | Hi/Lo | Value | Time |
|---|---|---|---|
| Wind | HI | 23 mph | 2:24 PM |
| Wind Gust | HI | 30 mph | 3:09 PM |
Those readings are onshore. What I am told is that the wind speed is about 4 mph higher out on the race course.
Boat speed summary in mph from my GoPro:

| Min. | 1st Qu. | Median | Mean | 3rd Qu. | Max. |
|---|---|---|---|---|---|
| 0.008948 | 5.820518 | 6.612395 | 6.711321 | 7.874029 | 14.414841 |
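A summary like this can be produced with R's `summary()` on the speed channel of the extracted telemetry. A minimal sketch, assuming the extracted CSV has a GPS speed column in meters per second named `speed` (the file name and column name here are illustrative; the real ones depend on the extraction tool):

```r
# Summarize boat speed from an extracted GoPro telemetry CSV.
# The "speed" column in m/s is an assumption; GoPro's GPS reports speed in m/s.
telemetry <- read.csv("race_telemetry.csv")
mph <- telemetry$speed * 2.23694  # meters per second to statute mph
summary(mph)                      # prints Min., 1st Qu., Median, Mean, 3rd Qu., Max.
```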
What got me was that this was without the spinnaker up!
SBYC Wet Wednesday, May 8, 2019, on J70 Escape. Had a good start and first leg.
Speed summary in knots:

| Min. | 1st Qu. | Median | Mean | 3rd Qu. | Max. |
|---|---|---|---|---|---|
| 1.985 | 4.607 | 5.132 | 5.105 | 5.594 | 7.797 |
Youtube video processed with GoPro Quik and MP4Joiner:
Youtube video processed with RaceRenderer:
It takes a long time to render a video for YouTube. I timed myself once: it is about five hours on a good day. It takes about twenty minutes just to transfer close to 23 gigabytes of video off the camera. To make matters worse, the GoPro records in chapters that are about 3.97 gigabytes each; the 78-minute video is five chapters.
It is not easy to create a long video. The tools for stitching the chapters together with gauges and telemetry are lacking. If I create the video with GoPro's Quik, I first have to turn on the gauges I want, then create a clip for each chapter. Lastly, I join the clips with a program called MP4Joiner.
Dashware would have been my preferred option. The problem with Dashware is that it is just not supported: it does not extract the telemetry data properly.
I am currently trying out RaceRenderer. What is good about it is that it joins the videos, extracts the telemetry properly, and saves a CSV for later analysis. It has plenty of gauges. The problem is, I like the look of the gauges from GoPro's Quik better. Well, you can't have it all.
If my team likes RaceRenderer better, I might just go forward with it. I have already done the work of building the template, which might make the job of processing the videos easier on me.
If you have an opinion about the two methods, let me know in the comments.
Wet Wednesday on J70 Escape. An interesting sail: we hit a mark, broached, plus a few other mishaps. Off of Leadbetter Beach:
| Measurement | Hi/Lo | Value | Time |
|---|---|---|---|
| Wind | HI | 19 mph | 5:44 PM |
| Wind Gust | HI | 24 mph | 4:54 PM |

Boat speed during the race (in mph):

| Min. | 1st Qu. | Median | Mean | 3rd Qu. | Max. |
|---|---|---|---|---|---|
| 0.000 | 5.615 | 6.353 | 6.540 | 7.438 | 15.421 |
This Wednesday was better than expected. All the weather reports suggested that the wind would be less than ten knots. We had some mild white caps instead.
We are doing better. We were not in the front of the pack, but we kept up with the others. On a downwind run, I allowed the boat behind us to steal our wind. Plus, I forgot that I was on starboard; I could have had some fun with that one.
The video I got is time-lapsed. Not as fun (for me) as regular video. No sound.
I used to sail light. Now I take at least one more crew than necessary. I actually find it comforting not to have to worry about having enough hands on board. I can single- and double-hand my J70, I just do not feel like it lately.
Last Wet Wednesday was special for me: we won. My boat Escape and my team were not even expected to come close to the other three competitors, and I did not have my new regatta sails up. Yet we won. We had an excellent start and sailed a conservative race. Our spinnaker douses were early so that we would not overshoot the mark. All in all, it was a good day.
Since the GPS data on my GoPro was a little wonky, this is a summary from the extracted files with anything over 20 statute miles per hour deleted:
| Min. | 1st Qu. | Median | Mean | 3rd Qu. | Max. |
|---|---|---|---|---|---|
| 3.467 | 6.152 | 6.747 | 7.067 | 7.880 | 12.974 |
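Cleaning out the wonky fixes before summarizing is a one-line filter in R. A minimal sketch, using the same 20 mph cutoff as above (the file name and `speed` column are illustrative assumptions about the extracted CSV):

```r
# Drop GPS glitches above 20 mph, then summarize and convert the max to knots.
# The "speed" column in m/s is an assumption about the extraction tool's output.
telemetry <- read.csv("race_telemetry.csv")
mph <- telemetry$speed * 2.23694       # m/s to statute mph
clean <- mph[!is.na(mph) & mph <= 20]  # delete anything over 20 mph
summary(clean)
max(clean) / 1.15078                   # statute mph to knots (12.974 mph is about 11.3 knots)
```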
A top speed of 12+ mph, or about 11.3 knots, is not bad!
I was surprised that we made it onto drone footage. It was very cool to see us on one of our better days! Starts at 15:08.
Good videos for J/70 handling. Section 3, Tack and Gybe, is put first since it covers the basic sailing of the boat. I am adding more videos from North Sails that I feel are good for learning how to sail the boat.
Step by step video on how to tack and gybe the J/70
Step by step video on how to set up the J/70 spinnaker
Step by step video on how to take down the J/70 spinnaker
In the fleet I sail in, all the top sailors do it. It may not be as fast or as fun, but in the right conditions you cannot beat the VMG. A good article on VMG can be found here: Why VMG Matters
This is my code for creating O'Shaughnessy's Trending Value screen. It is not my best code, since it was originally written for my personal and private consumption. It uses data exported from AAII Stock Investor Pro. I figured it might not be bad to share.
# Trend Value
# https://www.valuesignals.com/Screens/Details/OShaughnessy_Trending_Value
if (!require("pacman")) {
  install.packages("pacman")
}
rm(list = ls(all = TRUE))

# library(dplyr)
# library(quantmod)
# library(xlsx)
pacman::p_load(bindrcpp, dplyr, quantmod, xlsx, rio)

# Momentum will be Regression Relative Strength instead. 126 == 6 months.
# Options: 21, 42, 63, 126, 189
trenddays <- 126
SIGMA <- 0

ifelse(dir.exists("C:/Users/msghe/OneDrive/Stocks/R"),
       setwd("C:/Users/msghe/OneDrive/Stocks/R"),
       setwd("C:/Users/michael/SkyDrive/Stocks/R"))

stockdata <- read.csv("data/WEEKLY.TXT",
                      header = FALSE,
                      stringsAsFactors = FALSE,
                      na.strings = '-99999999.990')
stocknames <- read.csv("data/WEEKLY_KEY.TXT",
                       header = FALSE,
                       stringsAsFactors = FALSE,
                       na.strings = '-99999999.990')
names(stockdata) <- stocknames[, 1]

# Clean ticker names for Yahoo Finance
stockdata$TICKER <- gsub('.', '-', stockdata$TICKER, fixed = TRUE)

# Remove bad tickers (bad data was already NA'd via na.strings)
# stockdata[stockdata == -99999999.990] <- NA
stockdata <- stockdata[complete.cases(stockdata$TICKER), ]

# Create the All Stocks universe: a $150 million market cap in 1995 dollars,
# adjusted to today's dollars. All Stocks excludes over-the-counter issues.
# min.mark.cap <- quantile(stockdata$MKTCAP, na.rm = TRUE)[[2]]

# Inflation-adjust the $150 million (1995) market cap floor via CPI
getSymbols("CPIAUCSL", src = "FRED")
deflator <- last(Cl(to.yearly(CPIAUCSL)))[[1]] /
  Cl(to.yearly(CPIAUCSL))['1995'][[1]]
tvminmarketcap <- 150 * deflator
allstock <- stockdata[stockdata$MKTCAP > tvminmarketcap, ]
# stockdata[stockdata$MKTCAP > quantile(stockdata$MKTCAP, na.rm = TRUE)[[4]],]

# I will not invest in over the counter
condition <- c("N - New York", "A - American", "M - Nasdaq")
allstock <- filter(allstock, EXCHG_DESC %in% condition)
# min.mark.cap <- median(allstock$MKTCAP, na.rm = TRUE)
# min.mark.cap <- quantile(allstock$MKTCAP, na.rm = TRUE)[[2]]
# allstock <- filter(allstock, MKTCAP > min.mark.cap)  # minimum market cap
min(allstock$MKTCAP)  # sanity check

# Start ranking the stocks: calculate VC2.
# Subtract the ntile from 101 to reverse (correct) the order, so small is big.
allstock$PBVPS.Rank     <- 101 - ntile(allstock$PBVPS, 100)
allstock$PE.Rank        <- 101 - ntile(allstock$PE, 100)
allstock$PSPS.Rank      <- 101 - ntile(allstock$PSPS, 100)
allstock$EVEDA_12M.Rank <- 101 - ntile(allstock$EVEDA_12M, 100)
allstock$PCFPS.Rank     <- 101 - ntile(allstock$PCFPS, 100)
allstock$SHY.Rank       <- ntile(allstock$SHY, 100)

# Stocks with no rank get 50
allstock$PBVPS.Rank[is.na(allstock$PBVPS.Rank)] <- 50
allstock$PE.Rank[is.na(allstock$PE.Rank)] <- 50
allstock$PSPS.Rank[is.na(allstock$PSPS.Rank)] <- 50
allstock$EVEDA_12M.Rank[is.na(allstock$EVEDA_12M.Rank)] <- 50
allstock$PCFPS.Rank[is.na(allstock$PCFPS.Rank)] <- 50
allstock$SHY.Rank[is.na(allstock$SHY.Rank)] <- 50

# Sum the ranks and bucket into percentiles for VC2
allstock$SumRank <- allstock$PBVPS.Rank + allstock$PE.Rank +
  allstock$PSPS.Rank + allstock$EVEDA_12M.Rank +
  allstock$PCFPS.Rank + allstock$SHY.Rank
allstock$VC2 <- ntile(allstock$SumRank, 100)

# Keep the top VC2 buckets (VC2 > 70)
tvstocks <- filter(allstock, VC2 > 70)

# I like Fscore >= 5 and Z score >= 3; however, just use Fscore >= 7
tvstocks <- tvstocks[complete.cases(tvstocks$FSCORE_12M), ]
# tvstocks <- tvstocks[complete.cases(tvstocks$ZSCORE_Q1),]  # Zscore Prime
# tvstocks <- tvstocks[complete.cases(tvstocks$UDEF13),]     # weed out must-sells
# tvstocks <- filter(tvstocks, FSCORE_12M >= 4, ZSCORE_Q1 >= 1.8)
tvstocks <- filter(tvstocks, FSCORE_12M >= 7)

# My market cap needs
# tvstocks <- filter(tvstocks, MKTCAP > min.mark.cap)
# tvstocks <- filter(tvstocks, YIELD > 0)

# Final cleaning: require momentum at or above the universe median
# tvstocks <- tvstocks[complete.cases(tvstocks$RPS_6M),]
# tvstocks <- tvstocks[complete.cases(tvstocks$RS_26W),]
# tvstocks <- filter(tvstocks, RS_26W > 0)
# tvstocks <- filter(tvstocks, RS_26W >= median(allstock$PRCHG_26W, na.rm = TRUE))
tvstocks <- filter(tvstocks, PRCHG_13W >= median(allstock$PRCHG_13W, na.rm = TRUE))

# Keep it simple, stop using RRS
# stockenv <- new.env()
# getSymbols(tvstocks$TICKER, env = stockenv, adjust = TRUE)
# tvstocks[, "krs26"] <- NA
# for (i in ls(stockenv)) {
#   print(i)
#   tvstocks[tvstocks$TICKER == i, "krs26"] <-
#     last(Ad(stockenv[[i]]), 1)[[1]] / mean(last(Ad(stockenv[[i]]), 126))
# }

# Sort by 26-week price change and report the top 25
screen.rpt <- arrange(tvstocks, desc(as.numeric(PRCHG_26W)))
# screen.rpt <- arrange(tvstocks, desc(as.numeric(RPS_6M)))
# screen.rpt <- arrange(tvstocks, desc(as.numeric(krs26)))
screen.rpt <- select(screen.rpt, TICKER, COMPANY, SMG_DESC,
                     PRCHG_26W, MKTCAP, FSCORE_12M, VC2)
head(screen.rpt, 25)

write.xlsx(head(screen.rpt, 25),
           "MoneyPicks.xlsx",
           sheetName = "Money",
           append = FALSE,
           row.names = FALSE)
rio::export(head(screen.rpt, 25), "MoneyPicks.csv")
Out of boredom on a rainy Saturday, I reconfigured my blog. I kept the S3 bucket, but changed DNS providers and started using CloudFront. One of the reports is called the CloudFront Popular Objects Report. Sad to say, the most popular objects were requests probing for a WordPress Windows Live Writer exploit:
/xmlrpc.php
/wp2/wp-includes/wlwmanifest.xml
/wp1/wp-includes/wlwmanifest.xml
/wp/wp-includes/wlwmanifest.xml
/wp-includes/wlwmanifest.xml
/wordpress/wp-includes/wlwmanifest.xml
/website/wp-includes/wlwmanifest.xml
/web/wp-includes/wlwmanifest.xml
/test/wp-includes/wlwmanifest.xml
/sito/wp-includes/wlwmanifest.xml
/site/wp-includes/wlwmanifest.xml
/shop/wp-includes/wlwmanifest.xml
/news/wp-includes/wlwmanifest.xml
/media/wp-includes/wlwmanifest.xml
/cms/wp-includes/wlwmanifest.xml
/blog/wp-includes/wlwmanifest.xml
/2018/wp-includes/wlwmanifest.xml
/2017/wp-includes/wlwmanifest.xml
/2016/wp-includes/wlwmanifest.xml
/2015/wp-includes/wlwmanifest.xml