Symfony on shared host 1and1

Clear the cache of the local install (symfony cc), then freeze the symfony libraries into the project so the shared host does not need its own symfony installation:

symfony freeze

In web/.htaccess:

Options +FollowSymLinks +ExecCGI
 
<IfModule mod_rewrite.c>
  RewriteEngine On
 
  AddType x-mapp-php5 .php
  AddHandler x-mapp-php5 .php
 
  # UNCOMMENT the following line, if you are having trouble getting no_script_name to work
  #RewriteBase /

  # we skip all files with .something
  RewriteCond %{REQUEST_URI} \..+$
  RewriteCond %{REQUEST_URI} !\.html$
  RewriteRule .* - [L]
 
  # we check if the .html version is here (caching)
  RewriteRule ^$ index.html [QSA]
  RewriteRule ^([^.]+)$ $1.html [QSA]
  RewriteCond %{REQUEST_FILENAME} !-f
 
  # no, so we redirect to our front web controller
  RewriteRule ^(.*)$ index.php [QSA,L]
</IfModule>
 
# big crash from our front web controller
ErrorDocument 500 "<h2>Application error</h2>symfony application failed to start properly"

In web/php.ini:

magic_quotes_gpc = off
magic_quotes_runtime = off
magic_quotes_sybase = off

Switch the database connection settings to the production host's credentials in config/databases.yml and config/propel.ini.
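For example, databases.yml could be switched to the 1and1 MySQL server like this (a minimal sketch for symfony 1.0 with Propel; the host name, database name and credentials are placeholders to replace with the values 1and1 gives you):

all:
  propel:
    class:          sfPropelDatabase
    param:
      phptype:      mysql
      hostspec:     dbXXXXXXX.db.1and1.com
      database:     your_db_name
      username:     your_db_user
      password:     your_db_password

The matching DSN also goes into propel.ini (propel.database.url = mysql://your_db_user:your_db_password@dbXXXXXXX.db.1and1.com/your_db_name) so the Propel build tasks use the same connection.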



The Yahoo API response:
Array
(
  [ResultSet] => Array
  (
    [totalResultsAvailable] => 266933
    [totalResultsReturned] => 1
    [firstResultPosition] => 1
    [Result] => Array
      (
      [Title] => madonna 118
      [Summary] => Picture 118 of 184
      [Url] => http://www.celebritypicturesarchive.com/pictures/m/madonna/madonna-118.jpg
      [ClickUrl] => http://www.celebritypicturesarchive.com/pictures/m/madonna/madonna-118.jpg
      [RefererUrl] => http://www.celebritypicturesarchive.com/pgs/m/Madonna/Madonna%20picture_118.htm
      [FileSize] => 40209
      [FileFormat] => jpeg
      [Height] => 700
      [Width] => 473
      [Thumbnail] => Array
      (
        [Url] => http://scd.mm-b1.yimg.com/image/500892420
        [Height] => 130
        [Width] => 87
      )
    )
  )
)
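A minimal sketch of how to read fields out of this response once it has been unserialize()'d into $data (here totalResultsReturned is 1, so Result is the result array itself; with several results it becomes a numerically indexed list, as in the backlink loop further down):

<?php
// assuming $data holds the unserialized Yahoo response shown above
$result = $data['ResultSet']['Result'];

echo $result['Title'];             // madonna 118
echo $result['Url'];               // the full-size image url
echo $result['Thumbnail']['Url'];  // the thumbnail url
?>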


Get the backlinks of your site through the Yahoo API

Update (Feb 2012): Yahoo Site Explorer has been closed following the cooperation between Bing and Yahoo Search. The service has been replaced by the Bing Webmaster Tools API. More details on how to create a Bing WMT request here.

Caution: there seems to be a limit of 1,000 backlinks per domain, and it seems you can only see the backlinks of a site that has been verified in Bing Webmaster Tools: a real turn-off for the API…

By studying the backlinks of your site, or those of a competitor, you can tell many things:

  • who gives you natural backlinks,
  • what kind of sites they run,
  • what demographics the webmasters belong to…

But to make a thorough study over time you need to start somewhere and scrape your (or your competitors') backlinks.

You can do it by scraping Yahoo Site Explorer or, more elegantly, through the Yahoo API. Let's try the API solution: http://developer.yahoo.com/search/siteexplorer/V1/inlinkData.html

Set the variables

<?php
 
$api_service_url = "http://search.yahooapis.com/SiteExplorerService/V1/inlinkData";
$apiid = "Your Key";
$query  = "languagekompis.com";
$entire_site  = "";   // "1" to provide results for the entire site
$omit_inlinks = "domain";
$linksperrequest = 10;   // 100 is max value
$startposition = 1;
 
$request_url = sprintf("%s?appid=%s&query=%s&entire_site=%s&omit_inlinks=%s&output=php", $api_service_url, $apiid, urlencode($query), $entire_site, $omit_inlinks);

unserialize() the Yahoo response into an array and loop over $data["ResultSet"]["Result"]:

$currentpos = 0;
while ($currentpos++ >= 0) {

   // Formatting the API request url for this page of results
   $requrl = sprintf("%s&start=%s&results=%s", $request_url, ($currentpos-1)*$linksperrequest+$startposition, $linksperrequest);

   // Handle HTTP errors
   if (($content = file_get_contents($requrl)) === FALSE) {
      echo "HTTP error: $requrl\n";
      exit;
   }

   // Unserialize data from the response
   $data = unserialize($content);

   if (!is_array($data) || !array_key_exists("ResultSet", $data)) {
      echo "Error: Bad response from server\n";
      exit;
   }

   // Extract each link from the array
   for ($i=0; $i<sizeof($data["ResultSet"]["Result"]); $i++) {
      $url = $data["ResultSet"]["Result"][$i]["Url"];
      $title = $data["ResultSet"]["Result"][$i]["Title"];

      echo '<p><a href="'.$url.'">'.$title.'</a></p>';
   }

   // End the loop when no more results are returned
   if (sizeof($data["ResultSet"]["Result"]) < $linksperrequest) break;
}

?>

Done. Get the title and more from the Yahoo API response!



Why do I want to be an SEO Specialist?

5 reasons why I want to be an SEO Expert:

  1. For any Internet business, a good SEO strategy is a key success factor. SEO is life. No matter how good the business model or the application is, the site will not perform if it stays buried in the SERPs.
  2. Staying on top of SEO requires the willingness to continuously discover and adapt. The fast pace of the SEO world and real-time optimization offer exciting new challenges.
  3. The SEO expert holds a very strategic position. At the crossroads of all departments, the SEO expert is in the driver's seat and accountable for the success of their strategy.
  4. The SEO expert needs to cooperate with people across the organization so that the traffic acquisition strategy is implemented successfully.
  5. Finally, there is no university where you can learn SEO: it is learning by doing, and the workplace is certainly the best place to develop these skills.

6 reasons why I should be an SEO specialist



Why should I be an SEO Specialist?

6 reasons why I could be an SEO Specialist.

  • I have a Master of Science in computer networks, which I believe gives me the analytical skills required for this position.
  • For a year now, I have closely monitored the analytics of my site, proposed bounce rate analyses, carried out the on-site optimization and suggested new keywords to target.
  • In 2008, I developed a language exchange community from scratch, using basic on-site optimization techniques. Today the site ranks for competitive keywords and the community counts more than 1800 members.
    Through this project, I also manage a small team of volunteers who help me develop it.
    Daily, I use tools such as Google Analytics, SiteCatalyst, Google Webmaster Tools and SEMrush to analyze web traffic and fine-tune my site.
  • Business-minded? My freelance consultancy experience taught me the basic rules of running a business: respecting my customers and their money. It taught me that there are no shortcuts, and that it is better to choose experience over money. I also hold an MBA specialized in market finance…
  • I am entrepreneurial and will gladly share some examples of the projects I have been working on that could benefit the company.


5 reasons why I want to be an SEO Expert



Link exchange and X-Robots-Tag

or how not to give a link in exchange for a link.

In the HTTP header, we can set the X-Robots-Tag:

header("X-Robots-Tag: noindex", true);

will prevent search engines from showing the page in their search results.

More to the point here, you can prevent search engines from following the links on the page by sending the following header:

 header("X-Robots-Tag: nofollow", true);

The links on the page will look like normal, followed links in the HTML, even though search engines will not follow them: you display the link you promised, but it passes nothing.
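A minimal sketch of a link exchange page that sends the directives (the header must go out before any HTML; combining directives in one X-Robots-Tag header is valid, and the partner URL is a placeholder):

<?php
// send the robots directives before any output
header("X-Robots-Tag: noindex, nofollow", true);
?>
<html>
  <body>
    <!-- looks like a plain, followed link in the HTML source -->
    <a href="http://www.example.com/">our link exchange partner</a>
  </body>
</html>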



Number of External Links in a page

It is sometimes interesting to know how many external links a page has.
Parsing the URLs in the page and removing all those that contain the domain name gives a fair idea of the number of external links.

public function getExternalLinks( $page_content, $domain ){
  // Get the absolute urls found in the page
  $urls = UrlTools::get_urls( $page_content, $domain );

  // urls matching the domain are internal links, e.g. $domain = 'auto123.com'
  $pattern = '/'.preg_quote($domain, '/').'/';

  $externalLinks = array();
  foreach($urls as $raw_url) {
    // skip empty entries and internal links
    if (empty($raw_url) OR preg_match($pattern, $raw_url)) { continue; }
    $externalLinks[] = $raw_url;
  }
  return $externalLinks;
}

get_urls( $page_content, $domain ) should only return absolute URLs.
getExternalLinks() then returns all the external URLs, and count($externalLinks) gives the number of external links.
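For reference, here is a minimal sketch of what such a helper could look like. UrlTools::get_urls() is not shown in the post; this hypothetical version only keeps absolute http(s) URLs found in href attributes and ignores $domain (which could otherwise serve to turn relative URLs into absolute ones):

class UrlTools
{
  public static function get_urls( $page_content, $domain )
  {
    $urls = array();
    // keep only absolute http(s) urls found in href attributes
    if (preg_match_all('/href=["\'](https?:\/\/[^"\'#]+)/i', $page_content, $matches)) {
      $urls = $matches[1];
    }
    return $urls;
  }
}

// usage, from whatever class holds getExternalLinks():
// echo count( $this->getExternalLinks($page_content, 'auto123.com') );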



symfony on wamp

http://www.6ma.fr/tuto/symfony+avec+wamp-456



symfony propel-build-model wamp error due to XSL extension

Description: I was trying to run the symfony PHP framework. I tried to build the model using the command
php symfony propel:build-model.

It gave the error: Could not perform XSLT transformation. Make sure PHP has been compiled/configured to support XSLT.

Environment: Wamp server 2.0, PHP 5.2.5, Apache 2.2.6.

Reason: the php-xsl extension is not enabled. By default, WAMP server does not enable the php-xsl extension, and it is required here.

Solution: enable the php-xsl extension.

Left-click on WAMP's tray icon, then under PHP > PHP extensions select php_xsl and enable it. But there is one more php.ini file which WAMP won't change; we need to edit it by hand: open C:\wamp\bin\php\php5.2.5\php.ini and remove the ";" in front of the line extension=php_xsl.dll.



Install the PEAR packages you'll need for your crawler

Once PEAR is installed on your server:

Install Net_URL2:

cmd > pear install Net_URL2-0.3.0

Install HTTP_Request2:

cmd > pear install HTTP_Request2-0.5.1
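Once installed, a minimal sketch of what the crawler can do with HTTP_Request2 (the URL is a placeholder):

<?php
require_once 'HTTP/Request2.php';

// fetch a page and print its status code and body
$request = new HTTP_Request2('http://www.example.com/', HTTP_Request2::METHOD_GET);

try {
    $response = $request->send();
    echo $response->getStatus() . "\n";   // e.g. 200
    echo $response->getBody();            // the raw HTML to parse for links
} catch (HTTP_Request2_Exception $e) {
    echo 'Request failed: ' . $e->getMessage();
}
?>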




