Script for downloading Youtube videos


Forum: Programming and Scripting
Topic: Script for downloading Youtube videos
started by: curaga

Posted by curaga on June 26 2007,16:13
Don't look at me like that, I know it isn't useful.
But I just read the Advanced Bash-Scripting Guide and I had to try my skills on something.. So here's a much-enhanced version of the one from nixcraft.

I added some looks, clarity, comments, a help switch, and the ability to take URLs from a nearly unlimited number of args...

After downloading, the videos can be watched with MPlayer or a Flash player, and edited with mencoder..

Try and comment it

Posted by WDef on June 26 2007,19:36
Hey Curaga,

Not sure about this:

Code Sample
[[ $1 = http://* ]]


Try instead:

Code Sample
[ ${1%%://*} = http ]


to test if the beginning of the argument string is http://

EDIT: forget it, what you have works.  I wasn't looking at the double brackets.
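For the record, both forms work; here's a quick sketch comparing them (the check_* function names are just for this example):

```shell
# Both tests accept an http:// URL: [[ ]] does bash glob matching,
# while ${1%%://*} strips everything from "://" onward (POSIX-safe).
check_glob()  { [[ $1 = http://* ]] && echo yes || echo no; }
check_strip() { [ "${1%%://*}" = http ] && echo yes || echo no; }

check_glob  "http://youtube.com/watch?v=abc"   # yes
check_strip "http://youtube.com/watch?v=abc"   # yes
check_glob  "ftp://example.com"                # no
```

The glob form needs bash's double brackets; the parameter-expansion form also runs in a plain POSIX sh.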

Posted by u2musicmike on June 30 2007,15:36
I couldn't get it working.  I think it bombs out on the mktemp part.  I tried another name for tmp and it got further along.  Then it looked like error 303 on the wget.  Oh well interesting script.
Posted by curaga on July 01 2007,05:44
I didn't create it on DSL, so mktemp might be missing. Just replace it with a name..
Isn't error 303 just a redirection? I get something like that always, Youtube has 3 servers to balance traffic..

Posted by curaga on July 01 2007,06:19
More detailed usage:

Go to the Youtube video page you wish to download. Like here's a video of DSL:
http://www.youtube.com/watch?v=prjQ7MwuCWI
Copy that url. Then open Aterm and start this script (youtubedl, if you haven't renamed it)
and when it asks for that url, paste it there..

Or call it as "youtubedl http://www.youtube.com/watch?v=prjQ7MwuCWI"

Posted by jpeters on July 04 2007,19:29
I copied the script to a file called "rip" , and ran "rip [the youtube URL]"; produced a "wget --n invalid option" error.  No doubt, I'm missing something obvious.

Usage: wget [-c|--continue] [-q|--quiet] [-O|--output-document file]
               [--header 'header: value'] [-Y|--proxy on/off] [-P DIR] url

Posted by ^thehatsrule^ on July 04 2007,20:40
That's probably due to the wget from busybox...
Quote
-nv
Non-verbose - turn off verbose without being completely quiet (use -q for that), which means that error messages and basic information still get printed.
Should be safe just to remove it.
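If you'd rather keep -nv where it's supported, a sketch like this could pick the right options at runtime (this assumes the busybox applet mentions "BusyBox" in its help text, which its usage messages normally do):

```shell
# Keep -nv only when the wget on PATH understands it.
if wget --help 2>&1 | grep -q BusyBox; then
    WGET_OPTS=""      # busybox wget: no -nv support
else
    WGET_OPTS="-nv"   # full GNU wget: non-verbose but not silent
fi
echo "will run: wget $WGET_OPTS <url>"
```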

Posted by curaga on July 05 2007,08:52
I did have full wget.. That seems like the right solution for the busybox version
Posted by jpeters on July 05 2007,19:50
Quote (curaga @ July 05 2007,04:52)
I did have full wget.. That seems the right solution for busybox version

Doesn't matter; the script doesn't work even with the full wget installed. I don't know much about mktemp, but I tried it with that and with redirecting to a file 'mktp'

EDIT: Problem was that those little single quotes around mktemp didn't copy correctly; things are almost working now, although:

--00:56:25--  http://youtube.com/get_video.php?hl=en&video_id=prjQ7MwuCWI&l=82&t=OEgsToPDskI2GsLOn4un5ApwsZjXyl07&soff=1&sk=Tja__uRBddjozZ4rjZMhogC
          => `mktemp.flv'
Resolving youtube.com... 208.65.153.253, 208.65.153.251
Connecting to youtube.com[208.65.153.253]:80... connected.
HTTP request sent, awaiting response... 303 See Other
00:56:25 ERROR 303: See Other.

Posted by curaga on July 06 2007,08:24
I had this fully working.. But I personally haven't tested on DSL.
Those aren't single quotes, those are "backquotes" meaning the output of that command will be there:
VAR=`mktemp` VAR will have the output of mktemp
VAR='mktemp' VAR will have "mktemp"
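A tiny demo of the difference, using echo instead of mktemp:

```shell
# Backquotes run the command; single quotes keep the literal text.
a=`echo hello`      # command substitution -> a contains "hello"
b='echo hello'      # literal string       -> b contains "echo hello"
c=$(echo hello)     # $( ) is the modern spelling of backquotes
echo "a=$a"
echo "b=$b"
echo "c=$c"
```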

I'll edit it for DSL needs. The link will stay the same.

Posted by jpeters on July 06 2007,08:45
With the backquotes, the variable $tmp was empty. I guess I don't understand something.
Posted by curaga on July 06 2007,09:05
I now updated it, as you see from the beginning comments ;)
Posted by jpeters on July 06 2007,09:24
Looks the same; are you sure it linked?
Posted by curaga on July 06 2007,09:47
See the
#edited to work better with DSL?

I just removed wget's -nv switch and made $tmp a fixed name

Posted by jpeters on July 06 2007,10:07
Same 303 errors.  Maybe tomorrow...
Posted by curaga on July 06 2007,10:44
Error 303 is supposed to be there; I always get it. Maybe busybox wget exits on it, but full wget does as it should and follows the redirect?
Posted by lucky13 on July 06 2007,11:58
The 303 error is there because of the php script (get_video.php) at youtube that redirects to their content subservers. You probably never get 303 errors in your browser, unless you encounter a site with broken redirection chains, because browser engines handle that in the background. With wget, probably regardless of "real" wget or busybox, you get raw output. The 303 error isn't related to the wget script, it's on youtube's server side.
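In other words, the 303 only stops tools that don't follow redirects. These are the flags I'd try (commands shown here, not run; the URL is a placeholder, not a live link):

```shell
# Full GNU wget follows the 303 on its own; curl needs to be told to.
url='http://youtube.com/get_video.php?video_id=PLACEHOLDER'
wget_cmd="wget -S '$url' -O video.flv"   # -S prints server headers, so the 303 is visible but still followed
curl_cmd="curl -L '$url' -o video.flv"   # -L makes curl follow Location: headers
echo "$wget_cmd"
echo "$curl_cmd"
```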
Posted by jpeters on July 06 2007,16:12
Quote (curaga @ July 06 2007,06:44)
Error 303 is supposed to be there; I always get it. Maybe busybox wget exits on it, but full wget does as it should and follows the redirect?

Bottom line is an empty file.  I've used apt-get's wget and the one that comes with gnu-utils.
Posted by u2musicmike on July 10 2007,15:59
I never got this script working but I didn't try to run it as root.  Maybe the php script isn't working anymore?  Anyway I really don't download many youtube videos so I have been using this link to download:  http://www.kissyoutube.com/watch?v=prjQ7MwuCWI

Does anyone know what program converts flv to divx?

Posted by curaga on July 10 2007,16:04
mencoder
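Something along these lines should do it with mencoder's lavc mpeg4 encoder, which is DivX-compatible (file names and bitrates here are just examples to adjust; the conversion only runs if mencoder is installed):

```shell
# Sketch: re-encode a downloaded .flv to a DivX-compatible AVI.
convert_flv() {
    mencoder "$1" -o "${1%.flv}.avi" \
        -ovc lavc -lavcopts vcodec=mpeg4:vbitrate=800 \
        -oac mp3lame -lameopts abr:br=128
}

if command -v mencoder >/dev/null; then
    convert_flv myvideo.flv        # "myvideo.flv" is a placeholder name
else
    echo "mencoder not installed; command shown above"
fi
```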
Posted by curaga on July 15 2007,07:04
I just used this to get Jeff Buckley's Hallelujah. Still works fine, though this was not on DSL..
Posted by curaga on Sep. 18 2007,18:37
Okay, Youtube changed their video downloading, in an attempt to prevent people from saving their videos. So here's a new version that can still download :D

Quote
#!/bin/bash
# Script for downloading Youtube videos. Base was from nixcraft.org
# Edited by Curaga for my liking (Added ability for nearly unlimited URLs, help,
# comments, and ability to get urls from args)

case $1 in
    -h | *help )
        echo "Usage: ${0##*/} url1 url2.. or just ${0##*/} and type the urls"
        exit 0 ;;
esac

ripurl="http://youtube.com/get_video.php?"
tmp=`mktemp`
touch "$tmp.flv"
mkdir -p ~/YouTube; cd ~/YouTube
url="not empty in da beginning"

until [ -z "$url" ]; do
    if [ -n "$1" ] && [[ $1 = http://* ]]; then url=$1
    elif [ -n "$1" ]; then echo "Not a valid url. http:// required."; exit 1; fi

    if [ -z "$1" ]; then read -p "YouTube URL? Press enter to exit. " url; fi
    if [ -z "$url" ]; then
        rm "$tmp"
        clear; echo -e "\n    Finished."
        exit 0
    fi

    echo
    echo ------------------------------------------------------------------------
    echo -e "                  Now getting the page for analysis.\n"
    # Drop -nv if your wget is the busybox applet, it doesn't know that switch
    wget -nv "$url" -O "$tmp"

    # Pull the get_video.php query string out of the watch page
    uf=$ripurl`grep watch_fullscreen "$tmp" | sed "s;.*\(video_id.\+\)&title.*;\1;"`

    # The video title, used as the output filename
    nv=`grep 'h1 id="video_title"' "$tmp" | cut -d ">" -f 2 | cut -d "<" -f 1`

    echo -e "\n\n  Title: $nv\n\n"

    wget "$uf" -O "$tmp.flv"
    echo "" > "$tmp"
    # Slashes aren't allowed in filenames, swap them for underscores
    newname=`echo "$nv" | sed 's;/;_;g'`
    mv "$tmp.flv" "$newname.flv"
    echo -e "\n\n $HOME/YouTube/$newname.flv ready.\n"

    shift
done

Powered by Ikonboard 3.1.2a
Ikonboard © 2001 Jarvis Entertainment Group, Inc.