one step forward, two steps back. all of a sudden, the Spark clusters are defaulting to HDFS for all file operations, meaning I have to prepend "file://" to the local paths I was previously using bare. it must be due to something I did, but I can't figure out what, so I'm trying to deal with it. I put the "file://" into the Makefile definition, but then got the dreaded "multiple target patterns" error (the colon in "file://" collides with Make's target-and-prerequisite syntax when the path lands on a rule line). so I've been modifying the Python script itself instead. royal pain in the ass.
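for what it's worth, a minimal sketch of the kind of change I mean in the Python script (the function name and path are made up, just illustrating the idea):

```python
from pathlib import Path

def to_local_uri(path):
    # Prefix the path with file:// so Spark reads from the local
    # filesystem instead of the cluster's default (HDFS). as_uri()
    # requires an absolute path and handles percent-encoding.
    return Path(path).absolute().as_uri()

print(to_local_uri("/tmp/data.csv"))  # file:///tmp/data.csv
```

doing it in Python also sidesteps the Makefile problem entirely, since the colon never appears where Make can see it.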
last updated 2017-02-01 23:11:03. served from tektonic