D268.id554.diff

Index: Makefile
===================================================================
--- /dev/null
+++ Makefile
@@ -0,0 +1 @@
+include ../Makefile.common
Index: package.yml
===================================================================
--- /dev/null
+++ package.yml
@@ -0,0 +1,18 @@
+name : perl-www-robotrules
+version : 6.02
+release : 1
+source :
+ - https://cpan.metacpan.org/authors/id/G/GA/GAAS/WWW-RobotRules-6.02.tar.gz : 46b502e7a288d559429891eeb5d979461dd3ecc6a5c491ead85d165b6e03a51e
+license : Artistic-1.0-Perl
+component : programming.perl
+summary : WWW::RobotRules - database of robots.txt-derived permissions
+description: |
+ This module parses /robots.txt files as specified in "A Standard for Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters can use the /robots.txt file to forbid conforming robots from accessing parts of their web site. The parsed files are kept in a WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited. The same WWW::RobotRules object can be used for one or more parsed /robots.txt files on any number of hosts.
+rundeps :
+ - perl-uri
+setup : |
+ %perl_setup
+build : |
+ %perl_build
+install : |
+ %perl_install
Index: pspec_x86_64.xml
===================================================================
--- /dev/null
+++ pspec_x86_64.xml
@@ -0,0 +1,37 @@
+<PISI>
+ <Source>
+ <Name>perl-www-robotrules</Name>
+ <Packager>
+ <Name>Joey Riches</Name>
+ <Email>josephriches@gmail.com</Email>
+ </Packager>
+ <License>Artistic-1.0-Perl</License>
+ <PartOf>programming.perl</PartOf>
+ <Summary xml:lang="en">WWW::RobotRules - database of robots.txt-derived permissions</Summary>
+ <Description xml:lang="en">This module parses /robots.txt files as specified in &quot;A Standard for Robot Exclusion&quot;, at &lt;http://www.robotstxt.org/wc/norobots.html&gt;. Webmasters can use the /robots.txt file to forbid conforming robots from accessing parts of their web site. The parsed files are kept in a WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited. The same WWW::RobotRules object can be used for one or more parsed /robots.txt files on any number of hosts.
+</Description>
+ <Archive type="binary" sha1sum="79eb0752a961b8e0d15c77d298c97498fbc89c5a">https://solus-project.com/sources/README.Solus</Archive>
+ </Source>
+ <Package>
+ <Name>perl-www-robotrules</Name>
+ <Summary xml:lang="en">WWW::RobotRules - database of robots.txt-derived permissions</Summary>
+ <Description xml:lang="en">This module parses /robots.txt files as specified in &quot;A Standard for Robot Exclusion&quot;, at &lt;http://www.robotstxt.org/wc/norobots.html&gt;. Webmasters can use the /robots.txt file to forbid conforming robots from accessing parts of their web site. The parsed files are kept in a WWW::RobotRules object, and this object provides methods to check if access to a given URL is prohibited. The same WWW::RobotRules object can be used for one or more parsed /robots.txt files on any number of hosts.
+</Description>
+ <PartOf>programming.perl</PartOf>
+ <Files>
+ <Path fileType="library">/usr/lib/perl5/vendor_perl/5.24.1/WWW/RobotRules.pm</Path>
+ <Path fileType="library">/usr/lib/perl5/vendor_perl/5.24.1/WWW/RobotRules/AnyDBM_File.pm</Path>
+ <Path fileType="library">/usr/lib/perl5/vendor_perl/5.24.1/x86_64-linux-thread-multi/auto/WWW/RobotRules/.packlist</Path>
+ <Path fileType="man">/usr/share/man</Path>
+ </Files>
+ </Package>
+ <History>
+ <Update release="1">
+ <Date>2017-05-29</Date>
+ <Version>6.02</Version>
+ <Comment>Packaging update</Comment>
+ <Name>Joey Riches</Name>
+ <Email>josephriches@gmail.com</Email>
+ </Update>
+ </History>
+</PISI>
\ No newline at end of file
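
For reference, the WWW::RobotRules API summarized in the package description can be exercised with a short script like the one below. This is a minimal usage sketch, not part of the diff; the User-Agent string and the example.com URLs are placeholders.

    #!/usr/bin/perl
    use strict;
    use warnings;

    use WWW::RobotRules;
    use LWP::Simple qw(get);

    # Build a rules database keyed to this robot's User-Agent name.
    my $rules = WWW::RobotRules->new('MyBot/1.0');

    # Fetch a site's /robots.txt and feed it to the parser.
    my $robots_url = 'http://example.com/robots.txt';
    my $robots_txt = get($robots_url);
    $rules->parse($robots_url, $robots_txt) if defined $robots_txt;

    # The same object can hold parsed rules for any number of hosts;
    # consult it before fetching a URL.
    my $url = 'http://example.com/private/page.html';
    print $rules->allowed($url) ? "allowed\n" : "disallowed\n";

The new(), parse() and allowed() calls used here are the documented WWW::RobotRules interface; LWP::Simple (which, like WWW::RobotRules itself, is part of the split libwww-perl distribution) is used only to retrieve the file.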
