I've been rethinking the publication licenses of these blog posts.
Changes I made:
- Some "public" posts were changed to "CC BY-NC-SA"
- Some "all-rights-reserved" posts were changed to "CC BY-NC-SA"
At the moment there are still other public and all-rights-reserved posts. Most of the public ones are mine, while some are from friends who asked me to publish them under that license.
The all-rights-reserved ones are mostly content I grabbed from around the net and republished here when I wasn't able to contact the original authors (so credit goes to them); others are, for example, posts with pictures I took myself.
So, from now on, all these license types will co-exist, and each post will (hopefully) be tagged with the correct one.
Disclaimer: if not specified otherwise, you can assume a post is "CC BY-NC-SA" by Simone "roughnecks" Canaletti
https://creativecommons.org/licenses/by-nc-sa/4.0/
You can try this script on #fediverso at irc.libera.chat, where cage, ndo, other friends and I hang out
bot: "verne", running on @wpn
SearXNG instance: https://search.woodpeckersnest.space/
Thanks to cage for the script and ndo for creating the channel o/
# © cage released under CC0, public domain
# https://creativecommons.org/publicdomain/zero/1.0/
# Date: 16-08-2024
# Version: 0.1
# Package description: do a web search using your searxng instance
# Public instances probably won't work because of the "limiter"
# Authorize your channel from the partyline with:
# .chanset +searxng #your-channel
# Do a search
# .search <query> | .search paris (this query goes to default engine)
# .search +<engine> <query> | .search +wp paris (this query goes to
# wikipedia)
# .search !images paris | this query searches only for images of paris
# List of engines: https://docs.searxng.org/user/configured_engines.html
# tcllib is required
############## configuration directives ############################
# URL of the HTTP(S) server of the search engine
# (the trailing slash is required: the search path is appended directly)
set searxconfig(website_url) "https://example.com/searxng/"
# search command to trigger the search
set searxconfig(cmd) ".search"
# default search engine
set searxconfig(default_engine) "ddg"
# maximum number of search results printed
set searxconfig(max_results) 3
# time tracker file
# NB: when this script runs, any file with the same name in the working
# directory (depending on what the script considers its working
# directory) will be erased and overwritten!
set searxconfig(file_millis) "searx_millis.tmp"
# Minimum interval between two consecutive searches, in milliseconds
set searxconfig(max_freq) 30000
############## configuration ends here #############
# tcllib is required
package require csv
setudef flag searxng
if { !([info exists searxconfig(lastmillis)]) } {
set searxconfig(lastmillis) 0
}
bind pub - $searxconfig(cmd) search:searxng
proc send_message {message} {
putserv "PRIVMSG $message"
}
proc slurp_file {path} {
set fp [open $path r]
set file_data [read $fp]
close $fp
return $file_data
}
proc process_csv {csv channel} {
global searxconfig
set rows [split $csv "\n"]
set count 0
#remove the header
set rows [lrange $rows 1 end]
if {[llength $rows] < 1} {
send_message "$channel :Something went wrong."
} else {
foreach row $rows {
if {$count < $searxconfig(max_results)} {
set row_splitted [csv::split $row]
set title [lindex $row_splitted 0]
set url [lindex $row_splitted 1]
send_message "$channel :$title $url"
incr count
} else {
break
}
}
}
}
proc encode {query} {
    # Minimal percent-encoding: only the characters that would break
    # the query string are escaped
    set query [regsub -all { } $query "%20"]
    set query [regsub -all {&} $query "%26"]
    set query [regsub -all {=} $query "%3D"]
    set query [regsub -all {!} $query "%21"]
    return $query
}
proc get_query_results {engine query} {
global searxconfig
set query [encode $query]
set engine [encode $engine]
set url "$searxconfig(website_url)search?q=$query&format=csv&engines=$engine"
## uncomment the line below for debugging purposes
# putlog $url
return [exec curl -sS $url]
}
proc get_last_millis { } {
global searxconfig
if {[file exists $searxconfig(file_millis)]} {
set searxconfig(lastmillis) [slurp_file $searxconfig(file_millis)]
} else {
set fp [open $searxconfig(file_millis) w]
puts $fp 0
close $fp
get_last_millis
}
}
proc set_last_millis { } {
global searxconfig
set fp [open $searxconfig(file_millis) w]
puts $fp [clock milliseconds]
close $fp
}
proc search:searxng {nick host hand chan text} {
global searxconfig
if {!([channel get $chan searxng])} {
send_message "$chan :This script has not been authorized to run in this channel."
return 0
}
set millis [clock milliseconds]
get_last_millis
if { [expr $millis - $searxconfig(lastmillis)] > $searxconfig(max_freq) } {
## anti-flood check passed
set_last_millis
set text_splitted [split $text " {}"]
set engine [lindex $text_splitted 0]
set query [lrange $text_splitted 1 end]
if {![regexp {^\+} $engine]} {
set engine $searxconfig(default_engine)
set query $text_splitted
} else {
set engine [string range $engine 1 end]
}
if {$query == {}} {
send_message "$chan :Missing search criteria."
} else {
set csv [get_query_results $engine $query]
process_csv $csv $chan
}
return 1;
}
send_message "$chan :Try again later."
return 0
}
putlog "SearXNG Loaded"
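A side note on the encode proc above: it escapes only the four characters that would otherwise break the query string (space, &, =, !). For comparison, a full percent-encoding covers every reserved character; a quick sketch of the difference in Python (not part of the script):

```python
from urllib.parse import quote

# The script's encode proc escapes only space, &, = and !.
# quote() with safe="" percent-encodes everything except
# unreserved characters (letters, digits, _ . - ~).
query = "paris & rome = cities!"
print(quote(query, safe=""))
# -> paris%20%26%20rome%20%3D%20cities%21
```

For the simple queries the bot handles, the four escapes are usually enough; characters like # or + would still slip through unencoded.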
gemlog
@wpn has a new HTTP host for its XMPP server's:
- web-based chat, powered by converse.js,
- file upload,
- MUCs' pastebin,
- password_reset/invite/registration pages.
Webchat is now located at https://xmpp.woodpeckersnest.space/conversejs - only @wpn accounts can log in to it.
In other news, converse.js was recently upgraded and is now running the main git branch code, so you can preview the new "cyberpunk" theme in action, which will be released "soon".
Yet another small update about gemini.
You can now browse gemini://woodpeckersnest.space
even from regular HTTP, here: https://gemini.woodpeckersnest.space/
I've applied some fixes to the HTML and CSS (the latter is pretty much the same used by the @wpn onboarding page, but customized). As for accessibility, I think it works well on both desktop and mobile browsers; CGIs work as well.
The proxy I'm using is Loxy. I've already opened an issue on their repo about a problem with query strings and I'm still waiting for a reply. Apart from that, everything checks out.
gemlog