FireFly Media Server (formerly mt-daapd) › Firefly Media Server Forums › Firefly Media Server › Setup Issues › Playlists and Podcasts
This topic has 11 replies, 4 voices, and was last updated 13 years, 1 month ago by Anonymous.
31st October 2007 at 9:53 am #1893
rajt
Participant

I am having some issues with both Playlists and Podcasts. I am using version 1673.
Playlists:
I had an existing playlist of 12 songs to which I added 3 more. When scanning, the log reported the 3 new songs as ‘bad path’ even though Winamp played them fine (the original 12 were fine). I then moved the playlist up one directory. Firefly then found 2 of the 3 new songs.
I looked at the entry for the one song it couldn’t find and found nothing wrong. I ran the scan again, still no luck. I then ran it around 2 more times and it suddenly found the song. Strange!

Podcasts:
Not sure how to get this working correctly on my Roku. When you go to a site, say the BBC, they give you a ‘feeder’ URL for the podcast which normally ends in ‘.xml’, for example: http://downloads.bbc.co.uk/podcasts/bbc7/cbeebies/rss.xml. Putting this URL directly into an M3U file or a preset does not work on Rokus.
If you load this URL into Winamp and grab the URL of the actual podcast being played (it ends in ‘.mp3’), then it works. The only problem is that the ‘.mp3’ URL is for a specific date, i.e. if the podcast is updated every day, the ‘.mp3’ URL will just keep playing the same episode.
How do I set up a podcast as an M3U (or similar) that I can use every day and that is automatically updated with the latest episode?
Hope this makes sense!
31st October 2007 at 12:32 pm #13256
fizze
Participant

Roku and Firefly do not handle podcasts.
You need other tools that do that for you.
There are many, many so-called podcatchers out there; some offer more features, some less. As I use an NSLU2 as a music server, I am particularly fond of a web-based front end called Gregarius. It runs nicely even on the slug.
4th November 2007 at 11:09 pm #13257
sonichouse
Participant

I couldn’t get gregarius to work on my slug with cherokee, so I hacked this bashpodder script.
I then set up cron to check every 4 hours.
bash-3.2# cat /etc/crontab
SHELL=/bin/sh
PATH=/opt/bin:/sbin:/bin:/usr/sbin:/usr/bin
MAILTO=""
HOME=/
#
# Default is Empty
#
0 0-23/8 * * * root /usr/sbin/CheckDiskFull &>/dev/null
0 0 * * * root /usr/sbin/WatchDog &>/dev/null
1 * * * * root /usr/sbin/hwclock -s &>/dev/null
15 */4 * * * root /share/hdd/data/public/mp3/bashpodder.sh -p

All my podcasts get dumped into a directory, a playlist is generated, and I delete old podcasts.
#!/opt/bin/bash
# By Linc 10/1/2004
# Find the latest script at http://linc.homeunix.org:8080/scripts/bashpodder
# Last revision 10/23/2004 - The Lee Colleton/Wari Wahab/Dann Washko version.
# If you use this and have made improvements or have comments
# drop me an email at linc dot fessenden at gmail dot com
# I'd appreciate it!
#VARS
PATH=/opt/bin:/sbin:/bin:/usr/sbin:/usr/bin
log=podcast.log
conf=bp.conf
playlist=podcast.m3u
numitems=3
maxdays=35
FIREFLY=192.168.100.77
FIREFLYPORT=3689
#usage function to print usage.
usage()
{
builtin echo -e >&2 "usage: $0 [-p] [-l] [-h]\n -p make playlist\n -l keep log trimmed\n -h print this help"
exit 1;
}
#getopts for playlist and log trimming. (JCE)
while getopts plh opt
do
case "${opt}" in
p) pl=set;; # playlist switch
l) lt=set;; # log trim
h) hlp=set;; # show help exit
?) usage # unknown flag
exit 1;;
esac
done
shift $(($OPTIND - 1))
#check for help flag.
if [ "${hlp}" == "set" ]; then
usage
fi
# Quickie change to make the script more cron friendly
# Contributed by Wari Wahab
workdir=`dirname ${0}`
cd ${workdir}
# datadir is the directory you want the audio files to be put into
# I used a date string which will create a new directory every day
# Changed to iso date format as suggested by Lee Colleton!
# Updated to work on OSX by Dann Washko
# datadir=`date +%Y-%m-%d`
# SPF - set datadir to a fixed directory
datadir=podcasts
# test for existance of data directory and create if not there
if [ ! -d ${datadir} ]
then
mkdir ${datadir}
fi
# SPF - If this is the first time, touch podcast.log to shut up the grep below.
if ! [ -e ${log} ]
then
touch ${log}
fi
# Read the conf file for rss feeds you wish to grab audio from
# Used the sed line from Rodrigo Stulzer to better take care of
# non-standard rss. (JCE)
IFS_T=${IFS}
IFS='
'
I=0
while read url
do
urlcount=0;
for url in $(wget -q ${url} -O - | sed 's/<enclosure/\n<enclosure/g' | grep "<enclosure " | tr \' \" | sed -n 's/.*url="\([^"]*\)".*/\1/p')
do
# limit podcasts to numitems downloaded
urlcount=$(( $urlcount + 1 ))
if ! grep "${url}" ${log} > /dev/null
then
if [ $urlcount -le $numitems ] ; then
wget -q -P ${datadir} ${url}
echo ${url} >> ${log}
((I += 1))
fi
fi
#save all urls for uptodate log file
#if -l flag is thrown
if [ "${lt}" == "set" ]; then
NEWLOG[${I}]=${url}
fi
done
done < ${conf}
IFS=${IFS_T}
#Write a new log file containing only urls in current rss feeds.
#if -l flag is thrown.
if [ "${lt}" == "set" ]; then
rm ${log}
for urllog in "${NEWLOG[@]}"; do
echo ${urllog} >> ${log}
done
fi
# SPF - Remove old files
removals=`find ${datadir} -maxdepth 1 -type f -mtime +${maxdays}|wc -l`
if [ $removals -ge 1 ]; then
echo "Removing ${removals} files"
find ${datadir} -maxdepth 1 -type f -mtime +${maxdays} -exec rm {} \;
fi
# That's it!
# Oh yeah, you probably want an m3u playlist (at least I do)
# added a switch for making the play list (JCE)
if [ "${pl}" == "set" ] ; then
ls ${datadir} | grep -v m3u > ${datadir}/${playlist}
fi
# SPF - Update Firefly by forcing a scan if files downloaded/deleted
if [ $(($I + $removals)) -ge 1 ]; then
# use your own Firefly admin password in place of PASSWORD
wget --delete-after -q "http://mt-daapd:PASSWORD@${FIREFLY}:${FIREFLYPORT}/config-update.html?action=rescan"
fi
My bp.conf file is
http://downloads.bbc.co.uk/podcasts/fivelive/drkarl/rss.xml
http://downloads.bbc.co.uk/podcasts/worldservice/scia/rss.xml
http://downloads.bbc.co.uk/podcasts/radio4/fricomedy/rss.xml
http://downloads.bbc.co.uk/podcasts/radio2/ross/rss.xml
http://www.dailysourcecode.com/feed.xml

You just need to ensure bash and wget are installed. YMMV.
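Steve's script pulls enclosure URLs out of each feed with a sed/grep chain; if you want to sanity-check what that chain will extract before adding a feed to bp.conf, you can run it on a snippet. A minimal sketch, with an invented sample feed:

```shell
# Invented two-item RSS sample with the <enclosure url="..."> attributes
# the script looks for.
rss='<rss><channel>
<item><enclosure url="http://example.com/ep1.mp3" length="1" type="audio/mpeg"/></item>
<item><enclosure url="http://example.com/ep2.mp3" length="1" type="audio/mpeg"/></item>
</channel></rss>'

# Same idea as in the script: break the feed so each <enclosure> starts a
# new line (GNU sed, which the script also relies on), keep those lines,
# then extract the url="..." attribute value.
urls=$(printf '%s\n' "$rss" \
  | sed 's/<enclosure/\n<enclosure/g' \
  | grep '<enclosure ' \
  | sed -n 's/.*url="\([^"]*\)".*/\1/p')
echo "$urls"   # prints the two .mp3 URLs, one per line
```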
Steve
5th November 2007 at 7:36 am #13258
fizze
Participant

“I couldn’t get gregarius to work on my slug with cherokee, so I hacked this bashpodder script.”

Well, it’s running fine on my slug with Unslung, php-fcgi, sqlite and Cherokee.
If anyone cares I can post details.
I did have to hack the enclosurecache plugin, which wasn’t doing some things in an SQL-compliant way.

5th November 2007 at 9:03 pm #13259
sonichouse
Participant

I struggled to get my cherokee to talk to sqlite or mysql, so I gave up.
I would be interested to know what you did to fix it.
/Steve
13th November 2007 at 1:18 pm #13260
fizze
Participant

Dang, I’ve been rather busy lately. I didn’t forget about this, but I am actually proud of my slug, so I want to encourage others to use its potential. 🙂
It worked fine to install Cherokee with php-fcgi according to this wiki:
http://www.nslu2-linux.org/wiki/HowTo/DeployPHPWebAppUsingFastCGI

Getting PHP to work with sqlite is actually quite easy too. You need to install php-gd, and you need these lines in your php.ini:
extension=sqlite.so
extension=pdo_sqlite.so
Details on gregarius will follow. I intend to put this together properly in a blog post at some point.
13th November 2007 at 8:02 pm #13261
sonichouse
Participant

Thanks, trying to configure gregarius as I write.
At least it sees sqlite as an option now.

Steve
13th November 2007 at 10:34 pm #13262
sonichouse
Participant

Hi Fizze,
I managed to get it up and running; I had to install libgd, freetype, libjpeg and libpng to resolve the missing libraries.
It works well in Firefox, but IE7 fails on update (a known issue, I think).
Is there a way to automatically save enclosures to a directory on the server? The enclosure cache plugin on their web site is not available.
Did you say you re-edited it?

Regards, Steve.
14th November 2007 at 10:59 am #13263
fizze
Participant

Yes, I can make my version of the enclosurecache plugin available.
It had some very ugly SQL code snippets that needed to be fixed so that sqlite would accept them too. It seems MySQL isn’t that finicky when it comes to that. 😉
Just make sure you have proper permissions for the web-server user.
The enclosurecache plugin is recursive. It is called from the web browser when updating, and then invokes itself through command-line PHP as a detached command. It may not be nice, but it works. 😉

25th November 2007 at 8:58 am #13264
fizze
Participant

This is my version of enclosurecache.php.
It’s modified to be more SQL-compliant, and thus also works with sqlite (on the slug). It’s not really nice, as it looks for the php binary in /opt/bin. Make sure the web-server user has execute permissions!
I tried to get in touch with some of the gregarius devs to update the site repository, but this is proving to be difficult.
The changes are sparse, and I’ve added a few debug hooks to check permissions, amongst others. It is currently working like a charm for me, though. I’ve added a cron job that refreshes gregarius daily at 4am, and another cron job that triggers a scan in firefly at 5am. This way I can enjoy my newest podcasts during breakfast. 🙂
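The two cron jobs could look something like the sketch below. The schedule matches what is described above; the paths, the Gregarius refresh URL, and the credentials are assumptions, not the actual setup:

```
# /etc/crontab sketch (hypothetical paths and credentials)
# 4am: refresh the Gregarius feeds (substitute your install's refresh URL)
0 4 * * * root wget -q -O /dev/null "http://localhost/gregarius/update.php"
# 5am: ask Firefly to rescan its library (substitute your admin password)
0 5 * * * root wget -q --delete-after "http://admin:PASSWORD@localhost:3689/config-update.html?action=rescan"
```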
<?php
###############################################################################
# Gregarius - A PHP based RSS aggregator.
# Copyright (C) 2003 - 2005 Marco Bonetti
#
###############################################################################
# This program is free software and open source software; you can redistribute
# it and/or modify it under the terms of the GNU General Public License as
# published by the Free Software Foundation; either version 2 of the License,
# or (at your option) any later version.
#
# This program is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
# more details.
#
# You should have received a copy of the GNU General Public License along
# with this program; if not, write to the Free Software Foundation, Inc.,
# 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA or visit
# http://www.gnu.org/licenses/gpl.html
#
###############################################################################
# E-mail: mbonetti at users dot sourceforge dot net
# Web page: http://sourceforge.net/projects/gregarius
#
###############################################################################
/// Name: Enclosure Cache
/// Author: Keith D. Zimmerman
/// Description: Provides a local cache of enclosures found in the items
/// Version: 0.1
/// Configuration: __enclosure_cache_config
/**
* Changes:
*
* 0.1: First release
*
*/
require_once(dirname( __FILE__ ) . "/../init.php");
rss_set_hook("rss.plugins.items.newiid", "__enclosure_cache_go_get_it");
rss_set_hook('rss.plugins.javascript','__enclosure_cache_register');
rss_set_hook('rss.plugins.ajax.exports','__enclosure_cache_appendAJAXfunction');
rss_set_hook("rss.plugins.items.enclosure", "__enclosure_cache_link");
define ('EC_DOWNLOAD_DIR_CONFIG_KEY','downloadPath');
define ('EC_DOWNLOAD_PREFIX_CONFIG_KEY','downloadHttpPrefix');
define ('EC_AUTODOWNLOAD_CONFIG_KEY','autoDownload');
define ('EC_OVERWRITE_ENC_CONFIG_KEY','overwriteEnclosure');
//configuration function
function __enclosure_cache_config() {
if (rss_plugins_is_submit()) {
		rss_plugins_add_option(EC_DOWNLOAD_DIR_CONFIG_KEY, $_REQUEST[EC_DOWNLOAD_DIR_CONFIG_KEY], 'string');
		rss_plugins_add_option(EC_DOWNLOAD_PREFIX_CONFIG_KEY, $_REQUEST[EC_DOWNLOAD_PREFIX_CONFIG_KEY], 'string');
		rss_plugins_add_option(EC_AUTODOWNLOAD_CONFIG_KEY, isset( $_REQUEST[EC_AUTODOWNLOAD_CONFIG_KEY] ) ? $_REQUEST[EC_AUTODOWNLOAD_CONFIG_KEY] : "", 'string');
		rss_plugins_add_option(EC_OVERWRITE_ENC_CONFIG_KEY, isset( $_REQUEST[EC_OVERWRITE_ENC_CONFIG_KEY] ) ? $_REQUEST[EC_OVERWRITE_ENC_CONFIG_KEY] : "", 'string');
return;
}
	// (the echo statements printing the configuration form's HTML did not
	// survive posting; the form offered inputs for the four options above)
}
//hooks
function __enclosure_cache_go_get_it($params){
$newIid = $params[0];
$item = $params[1];
//if configured for autodownload and this item has an enclosure, go get it...
	if (rss_plugins_get_option( EC_AUTODOWNLOAD_CONFIG_KEY ) == 1 && array_key_exists('enclosure@url', $item) ) {
		$enclosure = $item['enclosure@url'];
__exp__enclosure_cache( $newIid, $enclosure, 1 );
}
return $params;
}
function __enclosure_cache_register($js) {
$js[] = getPath(). RSS_PLUGINS_DIR . "/enclosurecache.js";
return $js;
}
function __enclosure_cache_appendAJAXfunction($exp) {
$exp[]='__exp__enclosure_cache';
return $exp;
}
function __enclosure_cache_link($dummy){
$id = rss_item_id();
$url = rss_item_enclosure();
$savepath = realpath( rss_plugins_get_option( EC_DOWNLOAD_DIR_CONFIG_KEY ) );
if( file_exists( $savepath . '/' . basename( $url ) ) )
echo " (LOCAL CACHE)";
else
{
$bDone = ( __exp__enclosure_cache( $id, $url, 0 ) == "$id||$url||DONE" );
if( !$bDone )
		echo "<script>__enclosure_cache_js_timer( $id, \"$url\" );</script>";
?> | <a href="#" onclick="javascript: __enclosure_cache_download(<?php echo $id; ?>, '<?php echo $url; ?>'); return false;"
id='__ec__link<?php echo $id; ?>'>save in cache</a><pre style='display: inline;' id='__ec__status<?php echo $id; ?>'></pre><?php
	}
	return $dummy;
}
//end hooks

//exported function
function __exp__enclosure_cache($id, $url, $start) {
if( $start == 1 )
{
if( !is_writable( rss_plugins_get_option( EC_DOWNLOAD_DIR_CONFIG_KEY ) ) )
$ret = "ERROR: No permissions to write in " . rss_plugins_get_option( EC_DOWNLOAD_DIR_CONFIG_KEY ) . "!";
else
{
#		$savepath = realpath( rss_plugins_get_option( EC_DOWNLOAD_DIR_CONFIG_KEY ) );
#		$ret = exec('echo "'. __FILE__. '">'. $savepath .'/'.'gregarius.log');
#		$ret = exec( 'echo "/opt/bin/php -f ' . __FILE__ . ' -- --id ' . $id . ' --url ' . $url . '"> '. $savepath .'/'.'gregarius.log' );
#		$ret = $savepath .'/'.'gregarius.log';
		system( '/opt/bin/php -f ' . __FILE__ . ' -- --id ' . $id . ' --url ' . $url . ' &' );
		$ret = "Starting Download...";
#		$ret = ( 'php -f ' . __FILE__ . ' -- --id ' . $id . ' --url ' . $url . ' &' );
}
}
else
{
$savepath = realpath( rss_plugins_get_option( EC_DOWNLOAD_DIR_CONFIG_KEY ) );
if( file_exists( $savepath . '/' . $id . '_error.txt' ) )
{
$error_file = fopen( $savepath . '/' . $id . '_error.txt', 'r' );
//		fwrite($error_file,"test!");	// debug leftover: the handle was opened read-only
while( !flock( $error_file, LOCK_EX ) )
{
// wait for 100 ms
usleep(100000);
			}
			if( file_exists( $savepath . '/' . $id . '_status.txt' ) )
{
$status_file = fopen( $savepath . '/' . $id . '_status.txt', 'r' );
$ret = fread( $status_file, 2048 /*arbitrary max*/ );
fclose( $status_file );
}
else
{
$ret = "ERROR: " . fread( $error_file, 2048 /*arbitrary max*/ );
			}
			fclose( $error_file );
}
else
$ret = "DONE";
}
return $id . "||" . $url . "||" . $ret;
}

//helper function
function __enclosure_cache_filename_mangler($item_id, $enclosure_file_name){
$enclosure_base_name = basename( $enclosure_file_name );
$exten = strchr($enclosure_base_name, ".");
return substr($enclosure_base_name, 0, -strlen($exten)) . "_" . $item_id . $exten;
}

//handle command line mode!
$cline = isset($argv) && !$_REQUEST;
if($cline)
{
	$forced_args = array(
		array("--id", "FULL"),
		array("--url","FULL") );
	$parser = new __enclosure_cache_arg_parser($forced_args);
	$id = $parser->get_full_passed( "--id" );
	$url = $parser->get_full_passed( "--url" );
	$savepath = realpath( rss_plugins_get_option( EC_DOWNLOAD_DIR_CONFIG_KEY ) );
# $savepath = "/tmp";
$status_file = fopen( $savepath . '/' . $id . '_status.txt', 'w' );
# $status_file = fopen( '/tmp/enc_status.txt', 'w' );
$error_file = fopen( $savepath . '/' . $id . '_error.txt', 'w' );
# fwrite($error_file, $savepath);
#	fwrite($status_file, $savepath);
	$http = new __enclosure_cache_HTTPRequest( $url, $id, $savepath, $error_file, $status_file );
	$rez = $http->Download();
	if( !$rez )
		fwrite( $error_file, $http->errstr );

	//update SQL
if( $rez )
{
if( rss_plugins_get_option( EC_OVERWRITE_ENC_CONFIG_KEY ) == 1 )
{
// $sql = "update " . getTable( "item" ) . " set description=concat(description,'" .
$sql = "update " . getTable( "item" ) . " set description=description||'" .
rss_real_escape_string( "" ) .
"', enclosure='" .
rss_real_escape_string( rss_plugins_get_option( EC_DOWNLOAD_PREFIX_CONFIG_KEY ) . '/' . $http->FileName() ) .
"' where id=$id";
}
else
{
// $sql = "update " . getTable( "item" ) . " set description=concat(description,'" .
			$sql = "update " . getTable( "item" ) . " set description=description||'" .
				rss_real_escape_string( "<a href='" . rss_plugins_get_option( EC_DOWNLOAD_PREFIX_CONFIG_KEY ) . '/' . $http->FileName() . "'>Local enclosure cache link</a>" ) .
				"' where id=$id";
	}
	rss_query( $sql, false );
if( rss_sql_error() != 0 )
{
$rez = FALSE;
		fwrite( $error_file, "SQL: $sql\n" );
fwrite( $error_file, rss_sql_error_message() );
}
	}
	fwrite( $error_file, "\n\n" );
fclose( $error_file );
if( $rez )
unlink( $savepath . '/' . $id . '_error.txt' );
fclose( $status_file );
	unlink( $savepath . '/' . $id . '_status.txt' );
	rss_invalidate_cache();
}

//supporting classes
//this class copied and modified from http://us3.php.net/manual/en/function.fopen.php#58099
class __enclosure_cache_HTTPRequest
{
var $_fp; // HTTP socket
var $_url; // full URL
var $_host; // HTTP host
var $_protocol; // protocol (HTTP/HTTPS)
var $_uri; // request URI
	var $_port;	// port
	var $_http_status;	// first line of response
var $errno; //error!
var $errstr;
var $_filename; //the filename
var $_savepath; //where to save downloaded files
var $_sizegotten; //how many bytes downloaded
var $_sizetotal; //how many bytes total download
	var $_id;	// id of the rss item
	var $_error_file;	// used for flock
	var $_status_file;	// status reports

	// constructor
function __enclosure_cache_HTTPRequest($url, $id, $savepath, $errorfile, $statusfile)
{
$this->_error_file = $errorfile;
$this->_status_file = $statusfile;
$this->_savepath = $savepath;
$this->_id = $id;
$this->_url = $url;
		$this->_scan_url();
		$this->_sizegotten = 0;
$this->_sizetotal = 0;
	}

	// scan url
function _scan_url()
{
		$this->_filename = __enclosure_cache_filename_mangler( $this->_id, $this->_url );
		$req = $this->_url;
$pos = strpos($req, '://');
		$this->_protocol = strtolower(substr($req, 0, $pos));
		$req = substr($req, $pos+3);
$pos = strpos($req, '/');
if($pos === false)
$pos = strlen($req);
		$host = substr($req, 0, $pos);
		if(strpos($host, ':') !== false)
{
list($this->_host, $this->_port) = explode(':', $host);
}
else
{
$this->_host = $host;
$this->_port = ($this->_protocol == 'https') ? 443 : 80;
		}
		$this->_uri = substr($req, $pos);
if($this->_uri == '')
$this->_uri = '/';
	}

	// download URL
function Download()
{
		$crlf = "\r\n";

		// generate request
$req = 'GET ' . $this->_uri . ' HTTP/1.0' . $crlf
. 'Host: ' . $this->_host . $crlf
			. $crlf;

		// fetch
$this->_fp = fsockopen(($this->_protocol == 'https' ? 'ssl://' : '') . $this->_host, $this->_port, $this->errno, $this->errstr);
if( !$this->_fp )
{
$errs = $this->errstr;
			$this->errstr = "Attempting to open '" . $this->_url . "' failed:\n" . $errs;
return FALSE;
		}
		$handle = false;
$lasttick = 0;
$response = "";
fwrite($this->_fp, $req);
while(is_resource($this->_fp) && $this->_fp && !feof($this->_fp))
{
//status dump
if( ( time() - $lasttick >= 2 ) && flock( $this->_error_file, LOCK_EX ) )
{
fseek( $this->_status_file, 0 );
ftruncate( $this->_status_file, 0 );
fwrite( $this->_status_file, "Downloading: " . $this->_filename . " " );
if( $this->_sizetotal )
fwrite( $this->_status_file, sprintf( "%.1f", $this->_sizegotten / $this->_sizetotal * 100.0 ) . "% (" );
fwrite( $this->_status_file, $this->_sizegotten );
if( $this->_sizetotal )
fwrite( $this->_status_file, " of " . $this->_sizetotal . " bytes)" );
else
fwrite( $this->_status_file, " bytes of an unknown file size." );
flock( $this->_error_file, LOCK_UN );
$lasttick = time();
			}
			$response .= fread($this->_fp, 2048);
if( $handle === false )
{
// split header and body
$pos = strpos($response, $crlf . $crlf);
if($pos !== false)
{
$header = substr($response, 0, $pos);
					$body = substr($response, $pos + 2 * strlen($crlf));

					// parse headers
$headers = array();
$lines = explode($crlf, $header);
foreach($lines as $line)
if(($pos = strpos($line, ':')) !== false)
$headers[strtolower(trim(substr($line, 0, $pos)))] = trim(substr($line, $pos+1));
						else if( !isset( $this->_http_status ) )
							$this->_http_status = $line;

					// redirection?
					if(isset($headers['location']))
					{
#						$http = new __enclosure_cache_HTTPRequest($headers['location'], $this->_savepath);
						$http = new __enclosure_cache_HTTPRequest($headers['location'], $this->_id, $this->_savepath, $this->_error_file, $this->_status_file);
						fclose($this->_fp);
						return $http->Download();
					}
					// filename? (we certainly hope so!)
					else
					{
						if(isset($headers['content-length']))
							$this->_sizetotal = $headers['content-length'] + 0;
						if(isset($headers['content-disposition']))
						{
							if( preg_match( "/filename\s*=\s*\"*([^'\"\s]+)\"*/i", $headers['content-disposition'], $matches ) > 0 )
								$this->_filename = __enclosure_cache_filename_mangler( $this->_id, $matches[1] );
						}
					}
					}
					$handle = fopen( $this->_savepath . '/' . $this->_filename, "wb");
if( !$handle )
{
$this->errstr = "cannot create file: '" . $this->_savepath . '/' . $this->_filename . "'";
fclose($this->_fp);
return FALSE;
}
$this->_sizegotten = strlen( $body );
fwrite( $handle, $body );
$response = "";
}
}
else
{
$this->_sizegotten += strlen( $response );
fwrite( $handle, $response );
$response = "";
}
	}

	if( $response != "" )
{
$handle = fopen( $this->_savepath . '/' . $this->_filename, "wb");
if( !$handle )
{
$this->errstr = "cannot create file: '" . $this->_savepath . '/' . $this->_filename . "'";
fclose($this->_fp);
return FALSE;
}
fwrite( $handle, $response );
	}
	fclose( $handle );
	fclose($this->_fp);

	if( preg_match( "/200\s*OK/i", $this->_http_status ) != 1 )
{
$this->errstr = $this->_http_status;
return FALSE;
	}
	return TRUE;
	}

	function FileName()
{
return ($this->_filename);
}
}

//This class from http://us2.php.net/manual/en/features.commandline.php#52475
/**********************************************
* Simple argv[] parser for CLI scripts
* Diego Mendes Rodrigues - São Paulo - Brazil
* diego.m.rodrigues [at] gmail [dot] com
* May/2005
 **********************************************/

class __enclosure_cache_arg_parser {
var $argc;
var $argv;
var $parsed;
	var $force_this;

	function __enclosure_cache_arg_parser($force_this="") {
global $argc, $argv;
$this->argc = $argc;
$this->argv = $argv;
		$this->parsed = array();
		array_push($this->parsed, array($this->argv[0]) );

		if ( !empty($force_this) )
if ( is_array($force_this) )
				$this->force_this = $force_this;

		//Sending parameters to $parsed
if ( $this->argc > 1 ) {
			for($i=1 ; $i < $this->argc ; $i++) {
//We only have passed -xxxx
if ( substr($this->argv[$i],0,1) == "-" ) {
					//If we have -xxxx xxxx
if ( $this->argc > ($i+1) ) {
if ( substr($this->argv[$i+1],0,1) != "-" ) {
array_push($this->parsed,
array($this->argv[$i],
$this->argv[$i+1]) );
$i++;
continue;
}
}
}
//We have passed -xxxxx1 xxxxx2
array_push($this->parsed,
array($this->argv[$i]) );
}
		}

		//Testing if all necessary parameters have been passed
		$this->force();
	}

	//Testing if one parameter has been passed
function passed($argumento) {
		for($i=0 ; $i < $this->argc ; $i++)
if ( $this->parsed[$i][0] == $argumento )
return $i;
return 0;
	}

	//Testing if you have passed an extra argument, -xxxx1 xxxxx2
function full_passed($argumento) {
$findArg = $this->passed($argumento);
if ( $findArg )
if ( count($this->parsed[$findArg] ) > 1 )
return $findArg;
return 0;
	}

	//Returns xxxxx2 for a "-xxxx1 xxxxx2" call
function get_full_passed($argumento) {
		$findArg = $this->full_passed($argumento);
		if ( $findArg )
			return $this->parsed[$findArg][1];
		return;
	}

	//Necessary parameters to script
function force() {
if ( is_array( $this->force_this ) ) {
			for($i=0 ; $i < count($this->force_this) ; $i++) {
if ( $this->force_this[$i][1] == "SIMPLE"
&& !$this->passed($this->force_this[$i][0])
)
					die("\n\nMissing " . $this->force_this[$i][0] . "\n\n");

				if ( $this->force_this[$i][1] == "FULL"
&& !$this->full_passed($this->force_this[$i][0])
)
					die("\n\nMissing " . $this->force_this[$i][0] . "\n\n");
}
}
}
}
?>

25th November 2007 at 9:59 am #13265
sonichouse
Participant

Thanks Fizze,
I will investigate further.
30th November 2007 at 12:14 pm #13266
Anonymous
Inactive

Hi
I’m using a more heavily modified version of the BashPodder script on my DS 107e. I’m including it below should anyone be interested.
Features
– Keep variable amounts of podcasts per feed
– Builds m3u playlists and tells FireFly to rescan
– Modify ID3 tags on podcasts that set them badly
- Seems reasonably robust on my 107

I’m very new to FireFly (4 days!) but I’m already rather interested in integrating this more tightly; I’m still figuring out how FireFly works to allow me to do that.
Cheers
Martin
#!/bin/ash
# By Linc 10/1/2004
# Find the latest script at http://linc.homeunix.org:8080/scripts/bashpodder
# Last revision 07/01/2005 - Many Contributors!
# If you use this and have made improvements or have comments
# drop me an email at linc dot fessenden at gmail dot com
# I'd appreciate it!
#
# Modified by Martin Croome - Many mods but still largely inspired by the above
# Requires curl, xsltproc, wget and taged (which coredumps too often)
# Talks to firefly if it's present
# Tested on a Synology 107e with FireFly
# Directories and files
LOCKFILE="/tmp/lock.newpodder"
DATADIR="/volume1/music/podcasts"
LOGFILE="/opt/etc/podcast.history"
PODCASTCONF="/opt/etc/podcast.conf"
TEMPDIR="/tmp"
PLAYLISTS="Playlists"
M3ULASTDAY="Podcasts (Today).m3u"
M3UALL="Podcasts (All).m3u"
# PODCASTCONF file format is made up of lines like this:
#
# podcast_url|dir|count|fixid3
# where
# podcast_url is the URL for the podcast
# dir is the directory under DATADIR where the podcast will be stored
# count is the number of "episodes" to keep. -1 keeps all of them.
# fixid3 rewrites various ID3 headers with information from the RSS
# This uses taged which barfs on some files. I use this to
# rewrite the tags on some podcasts that don't set them well.
#
# e.g. http://podcast.rtl.fr/onrefaitlemonde.xml|OnRefaitLeMonde|2|1
# http://downloads.bbc.co.uk/podcasts/radio4/fooc/rss.xml|FOOC|2|0
TEMPLOG="${TEMPDIR}/newpodder.$$.log"
TEMPXSLT="${TEMPDIR}/newpodder.$$.xsl"
# External programs. NOTE: Redefined further below for debug
TAGED="/opt/bin/taged -n"
XSLTPROC="/opt/bin/xsltproc"
WGET="/opt/bin/wget -q"
CURL="/opt/bin/curl -s"
# Parameter defaults
CATCHUP=0
VERBOSE=0
FIREFLYPASS="xxxxxx"
MAXTITLELENGTH=50
FORCE=""
LOGREDIRECT=""
QUIET=0
pdebug()
{
if [ $VERBOSE -eq "1" ]
then
if [ "X${LOGREDIRECT}" == "X" ]
then
echo "DEBUG: $*"
else
echo "DEBUG: $*" >> $LOGREDIRECT
fi
fi
}
plog()
{
if [ $QUIET -eq "0" ]
then
if [ "X${LOGREDIRECT}" == "X" ]
then
echo "`date`: $*"
else
echo "`date`: $*" >> $LOGREDIRECT
fi
fi
}
RDATE=""
dateformat()
{
RDATE=`echo $1 | awk '/(^Mon,)|(^Tue,)|(^Wed,)|(^Thu,)|(^Fri,)|(^Sat,)|(^Sun,)/ { printf("%4d %3s %2d", $4, $3, $2); }' -`
if [ ${#RDATE} -eq 0 ]
then
RDATE=$1
fi
return 0
}
cleanup()
{
pdebug "Cleaning up"
rm -f "${TEMPDIR}/newpodder.*"
}
instance_lock()
{
TEMPFILE="${TEMPDIR}/bashpodder.temp.$$"
echo $$ > $TEMPFILE ||
{
echo "Could not create lock file"
return 1
}
trap "rm -f $TEMPFILE; exit" SIGINT SIGTERM
ln $TEMPFILE $LOCKFILE 2> /dev/null &&
{
rm -f $TEMPFILE
trap "cleanup; rm -f $LOCKFILE; exit" SIGINT SIGTERM
return 0
}
kill -0 `cat $LOCKFILE` 2> /dev/null &&
{
rm -f $TEMPFILE
return 1
}
pdebug "Removing stale lock file"
rm -f $LOCKFILE
ln $TEMPFILE $LOCKFILE 2> /dev/null &&
{
rm -f $TEMPFILE
trap "cleanup; rm -f $LOCKFILE; exit" SIGINT SIGTERM
return 0
}
rm -f $TEMPFILE
trap - SIGINT SIGTERM
return 1
}
listfeeds()
{
echo "FixID3 Count Poddir"
echo "------ ----- ------"
while read line
do
# Ignore comment lines
if [ `expr match "$line" ' *#'` -gt 0 ]
then
continue
fi
# Split up parameters
OIFS="$IFS"
IFS='|'
set -- $(echo "$line")
IFS="$OIFS"
echo "$4 $3 $2"
done < $PODCASTCONF
}
generatem3u()
{
plog "Creating playlists"
playlistdir="${DATADIR}/${PLAYLISTS}"
if ! [ -d "${playlistdir}" ]
then
mkdir $playlistdir
fi
# Use this to generate relative paths. I don't want to cd around.
escapeddir=$(echo $DATADIR | sed 's/\//\\\//g')
# Create a playlist of the files added in the last 24 hours
# Busybox find doesn't support | in wildcard
find "$DATADIR" -mtime -1 -print -name '*' |
egrep ".*\.(mp3|m4a|m4p|ogg|flac)$" |
sed "s/${escapeddir}/../" > "${playlistdir}/${M3ULASTDAY}"
# Create a playlist of all the files
find "$DATADIR" -print -name '*' |
egrep ".*\.(mp3|m4a|m4p|ogg|flac)$" |
sed "s/${escapeddir}/../" > "${playlistdir}/${M3UALL}"
}
# Make script crontab friendly:
cd $(dirname $0)
#Check if another instance is already running
instance_lock || {
echo "An instance of $0 is already running"; exit 1
}
# Get options
while getopts lv:cp:m:f:L:qg o
do case "$o" in
v)
case "$OPTARG" in
1)
VERBOSE=1
;;
2)
VERBOSE=1
# Debug versions of external programs
XSLTPROC="/opt/bin/xsltproc -v"
WGET="/opt/bin/wget -v"
TAGED="/opt/bin/taged -v"
CURL="/opt/bin/curl -v"
;;
esac
;;
c)
CATCHUP=1
;;
p)
FIREFLYPASS="$OPTARG"
;;
m)
MAXTITLELENGTH="$OPTARG"
;;
l)
listfeeds
cleanup
exit 0
;;
f)
FORCE="$OPTARG"
;;
L)
LOGREDIRECT="${OPTARG}"
if ! [ -e "$LOGREDIRECT" ]
then
touch $LOGREDIRECT
fi
;;
q)
QUIET=1
;;
g)
generatem3u
exit 0
;;
[?])
cat >&2 <<ENDOFUSAGE
Usage: $0 [-cvpmlfLq]
-c Catchup. Don't download just log.
-f [poddir] Force download of a specific podcast. Regex on
podcast directory.
-g Just generate m3u files.
-l List configured podcasts.
-L [logfile] Log to file
-m [Length] Limit title length when setting ID3 title.
-p [password] Password for Firefly server.
-q Quiet. Switch off all messages
-v [1|2] Verbose. 2 levels. 2 is highest.
ENDOFUSAGE
exit 1
;;
esac
done
# Create XSLT for processing RSS feeds
# (the stylesheet did not survive the original post; this reconstruction
# emits the fields in the order the parser below expects:
# title, ttl, then pubDate/description/url per item)
cat > $TEMPXSLT <<ENDOFXSLT
<?xml version="1.0"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output method="text"/>
  <xsl:template match="/rss/channel">
    <xsl:value-of select="title"/><xsl:text>&#10;</xsl:text>
    <xsl:value-of select="ttl"/><xsl:text>&#10;</xsl:text>
    <xsl:for-each select="item[enclosure]">
      <xsl:value-of select="pubDate"/><xsl:text>&#10;</xsl:text>
      <xsl:value-of select="normalize-space(description)"/><xsl:text>&#10;</xsl:text>
      <xsl:value-of select="enclosure/@url"/><xsl:text>&#10;</xsl:text>
    </xsl:for-each>
  </xsl:template>
</xsl:stylesheet>
ENDOFXSLT
# DATADIR is the root for podacast directories
# Check for and create DATADIR if necessary:
if test ! -d $DATADIR
then
pdebug "Creating data directory"
mkdir $DATADIR
fi
# If this is the first time, touch LOGFILE to shut up the grep below.
if ! [ -e $LOGFILE ]
then
touch $LOGFILE
fi
# Read the PODCASTCONF file.
plog "Start run"
while read line
do
# Ignore comment lines
if [ `expr match "$line" ' *#'` -gt 0 ]
then
pdebug "Comment ${line}"
continue
fi
# Split up parameters
OIFS="$IFS"
IFS='|'
set -- $(echo "$line")
IFS="$OIFS"
podcast=$1
poddir=$2
count=$3
fixid3=$4
pdebug "Podcast=${podcast} Poddir=${poddir} Cnt=${count} FixID3=${fixid3}"
# If forcing match regex
if [ "X${FORCE}" != "X" ] && [ `expr match "${poddir}" "${FORCE}"` -eq 0 ]
then
pdebug "Skip ${poddir}"
continue
fi
# count 0 means that no files will be downloaded
# count -1 means that all files will be downloaded
# count n>0 means that the n latest files will be kept or downloaded
if [ "$CATCHUP" -eq "1" ]
then
count=0
fi
# Create feed directory
feeddir="${DATADIR}/${poddir}"
pdebug "Feeddir=${feeddir}"
if test ! -d $feeddir
then
mkdir $feeddir
pdebug "Creating ${feeddir}"
fi
# Download and process RSS
# Order of stream produced by XSLT
# title
# ttl
# Then for each item
# pubDate
# description
# url
pdebug "Now parse items"
# Readstate states
# 0 - Read feedname
# 1 - Read TTL
# 5 - Read pubDate
# 6 - Read description
# 9 - Read url and process
readstate=0
$WGET $podcast -O - | $XSLTPROC $TEMPXSLT - 2> /dev/null | while read feed
do
pdebug "State ${readstate} Feed ${feed}"
# 0-4 states for header
case "$readstate" in
0)
feedname=$feed
readstate=1
continue
;;
1)
ttl=$feed
readstate=5
pdebug "Feedname=${feedname} TTL=${ttl}"
continue
;;
5)
pubDate=$feed
readstate=6
continue
;;
6)
description=$feed
readstate=9
continue
;;
9)
url=$feed
pdebug "Pubdate=${pubDate} Url=${url}"
realurl=`$CURL -I -L -w '%{url_effective}' --url "$url" | tail -n 1`
filename=`echo "$realurl" | awk -F / '{print $NF}' | sed -e "s/%20/ /g" -e "s/%27/'/g" -e "s/%23/#/g" | awk -F ? '{print $1}'`
filepath="${poddir}/${filename}"
# remove older files
if [ "$count" -eq "0" ]
then
if [ -e $feeddir/$filename ]
then
pdebug "Removing file ${filepath}"
rm $feeddir/$filename
fi
fi
# If file has not previously been processed
# NOTE: If the number of files downloaded is increased on a feed
# this will prevent older files being downloaded
if [ "X$FORCE" != "X" ] || ! grep "${filepath}" $LOGFILE > /dev/null
then
pdebug "${filepath} not found in log or forced"
if [ "$count" -eq "0" ]
then
pdebug "Log file ${filepath}"
echo "${filepath}" >> $TEMPLOG
else
pdebug "Download file ${filepath}"
# Try to resume if we can
if { $WGET -t 1 -N -c $realurl -O $feeddir/$filename || $WGET -t 1 -N $realurl -O $feeddir/$filename; } && [ -e "${feeddir}/${filename}" ]
then
plog "Downloaded ${filepath}"
echo "${filepath}" >> $TEMPLOG
# Now check if we need to fix the ID3 tag
if [ "$fixid3" -eq "1" ]
then
dateformat "${pubDate}"
newtitle="${feedname} ${RDATE}"
if [ "${#newtitle}" -gt "$MAXTITLELENGTH" ]
then
titlelength=`expr $MAXTITLELENGTH - ${#RDATE} - 1`
newtitle="`expr substr "$feedname" 1 $titlelength` ${RDATE}"
fi
pdebug "Change tags title=${newtitle}"
# Note: Update to ID3v2 to avoid ID3v1 core dump with tags longer than 30 chars
# -u & -2 does not seem to work
$TAGED -u -2 -A "${feedname}" -t "${newtitle}" -g "Podcast" -c "${description}" $feeddir/$filename || plog "ERROR: ${feedname} ID3 Edit failed"
fi
else
plog "ERROR: Failed to download ${filepath}"
fi
fi
fi
# Count the first n files that exist
if [ "$count" -gt "0" ] && [ -e "${feeddir}/${filename}" ]
then
count=`expr $count - 1`
pdebug "Count = ${count}"
fi
readstate=5
continue
;;
esac
done
done < $PODCASTCONF
# Move dynamically created log file to permanent log file:
pdebug "Processing log file"
cat $LOGFILE >> $TEMPLOG
sort $TEMPLOG | uniq > $LOGFILE
rm $TEMPLOG
# Generate playlists
generatem3u
# Force firefly to rescan
pdebug "Asking FireFly to rescan"
$WGET --delete-after "http://localhost:${FIREFLYPASS}@localhost:3689/config-update.html?action=rescan" || plog "ERROR: Unable to notify FireFly"
#Release lock
cleanup
plog "Run finished"
rm -f $LOCKFILE
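As a side note on the script above: the pipe-delimited podcast.conf lines are split by temporarily switching IFS and letting `set --` re-parse the line into positional parameters. The same trick can be tried standalone (the sample line here is invented):

```shell
# One config line in the url|dir|count|fixid3 format the script reads.
line='http://example.com/feed.xml|MyPodcast|2|1'

# Temporarily make '|' the field separator, let `set` split the line
# into positional parameters, then restore the old IFS.
OIFS="$IFS"
IFS='|'
set -- $line
IFS="$OIFS"

podcast=$1; poddir=$2; count=$3; fixid3=$4
echo "$poddir: keep $count episodes"   # prints "MyPodcast: keep 2 episodes"
```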
The forum ‘Setup Issues’ is closed to new topics and replies.