Pages

Tuesday, April 26, 2011

python documentation generator

Sphinx is a tool that makes it easy to create intelligent and beautiful documentation. It is released under the BSD license.
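
Sphinx pulls documentation out of reStructuredText sources, including Python docstrings, so the docs can live next to the code. As a minimal sketch, here is a docstring in the reST field-list style that Sphinx can render (the function itself is a made-up example):

def area(width, height):
    """Return the area of a rectangle.

    :param width: width of the rectangle
    :param height: height of the rectangle
    :returns: width multiplied by height
    """
    return width * height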

For more details, click here.

Thursday, April 14, 2011

RPMforge

Type: System
Operating System: Red Hat, CentOS

RPMforge is a collaboration of several package providers, including Dag, Dries, and others. It provides packages built with additional compilation options beyond the base distribution, along with a large number of media tools.

WEBSITE

http://rpmrepo.net/RPMforge

http://wiki.centos.org/AdditionalResources/Repositories/RPMForge

Wednesday, July 28, 2010

TinyMCE - Javascript WYSIWYG Editor

TinyMCE, also known as the Tiny Moxiecode Content Editor, is a platform-independent web-based JavaScript/HTML WYSIWYG editor control, released as open source software under the LGPL by Moxiecode Systems AB. It has the ability to convert HTML textarea fields or other HTML elements to editor instances. TinyMCE is designed to easily integrate with content management systems.

TinyMCE integrates with many different open source systems, such as Mambo, Joomla!, Drupal, Plone, WordPress, b2evolution, e107, phpWebSite, Umbraco and Freedomeditor.

More details

Thursday, May 6, 2010

The Web Robots Pages

Web Robots (also known as Web Wanderers, Crawlers, or Spiders) are programs that traverse the Web automatically. Search engines such as Google use them to index web content, spammers use them to scan for email addresses, and they have many other uses.

Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called The Robots Exclusion Protocol.

It works like this: a robot wants to visit a Web site URL, say http://www.example.com/welcome.html. Before it does so, it first checks for http://www.example.com/robots.txt, and finds:

User-agent: *
Disallow: /

The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.

There are two important considerations when using /robots.txt:

* robots can ignore your /robots.txt. In particular, malware robots that scan the web for security vulnerabilities, and email address harvesters used by spammers, will pay no attention.
* the /robots.txt file is a publicly available file. Anyone can see what sections of your server you don't want robots to use.



You can learn more about web robots at http://www.robotstxt.org/.

Saturday, April 10, 2010

Greasemonkey Add-ons for Firefox

Greasemonkey is a Firefox extension that allows you to write scripts that alter the web pages you visit. You can use it to make a web site more readable or more usable. You can fix rendering bugs that the site owner can't be bothered to fix themselves. You can alter pages so they work better with assistive technologies that speak a web page out loud or convert it to Braille. You can even automatically retrieve data from other sites to make two sites more interconnected.


Step 1. Download the extension: go to url.

Step 2. To test that Greasemonkey is working, download a user script: go to url.

Step 3. Open youtube.com and look for "download". You can now download videos from the YouTube site.

OR

How to start: go to url

OR

How to write Greasemonkey scripts: go to url

Saturday, December 19, 2009

How to convert XML to HTML using XSLT

In this tutorial you will learn how to use XSLT to transform XML documents into other formats, like HTML.


What is XML?


XML stands for eXtensible Markup Language.
XML is designed to transport and store data.
XML is important to know, and very easy to learn.
Refer: http://www.w3schools.com/xml/default.asp


What is XSLT?


XSL stands for eXtensible Stylesheet Language, and is a style sheet language for XML documents.
XSLT stands for XSL Transformations.

Refer: http://www.w3schools.com/xsl/

Where to Start?
http://www.php.net/manual/en/xsl.examples.php

System Requirements?

Programming tools: PHP, XSL, XML, DOM
Server Requirements: PHP-XML, PHP-XSL (check for module installation on your host with a simple PHP script: <?php phpinfo(); ?>)
Cross platform: Yes (Safari, Firefox, Opera)
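
The PHP examples linked above use PHP's DOM and XSL extensions. As a point of comparison rather than this post's own method, the same kind of transformation in Python with the third-party lxml library looks like this (a minimal sketch; books.xml and books.xsl are made-up file names):

from lxml import etree  # third-party library: http://lxml.de/

# Parse the XML document and the XSLT stylesheet (file names are examples)
doc = etree.parse("books.xml")
stylesheet = etree.parse("books.xsl")

# Compile the stylesheet into a callable transformer and apply it
transform = etree.XSLT(stylesheet)
result = transform(doc)

# Serialize the transformed tree, e.g. as HTML
print etree.tostring(result, pretty_print=True)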

Monday, October 26, 2009

Installing multiple versions of Python from Source

The first step is to ensure that all dependencies are installed. Run the following once:

sudo apt-get build-dep python2.5

That will install a bunch of dev packages. Which packages get installed will likely depend on the specific system.

As the remaining commands will need to be repeated for each version of python, I will list them once with X's in place of the version numbers. Be sure to replace the X's with the appropriate version numbers. The various versions and download links can be found on the Python download page.

# download and unpack the source tarball
wget http://python.org/ftp/python/X.X.X/Python-X.X.X.tgz

tar xvfz Python-X.X.X.tgz

cd Python-X.X.X

# build with a version-specific prefix so the installs don't collide
./configure --prefix=/opt/pythonX.X

make

sudo make install


In a couple of versions I got some warnings after running make about missing dependencies for things I don't need or use, so I ignored them and everything worked fine. Of course, the new interpreters need to be on my path to be useful, so I created some links:

sudo ln -s /opt/pythonX.X/bin/pythonX.X /usr/bin/python-X.X

While the steps above only need to be done once, a package has to be installed separately under each Python version that should have it:

sudo python2.X setup.py install
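
To confirm that each link invokes the interpreter you expect, a quick sanity check helps. A minimal sketch (save it as, say, check_version.py; the file name is just an example):

# Print which interpreter is running and its version; handy when
# several Pythons live side by side under /opt
import sys

print sys.executable  # e.g. /opt/python2.6/bin/python2.6
print sys.version

Running it with python-2.5 check_version.py and then python-2.6 check_version.py should print the corresponding /opt prefixes.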