We are talking about displaying static content, not running a dynamic page.

I scrape lots of content from major retailers to present images for product inventory based on UPC scanned inputs.

For example: target.com is easy; in PHP it is as simple as...

$sPage = @file_get_contents( $aSite[ 'url' ] ) ;
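One caution with that one-liner: the @ suppresses the warning, so it is worth checking the return value yourself. A minimal sketch (the data: URL below is just a stand-in so the snippet runs without hitting a real retailer; in the real code $aSite[ 'url' ] is the product URL):

```php
// In the real code $aSite[ 'url' ] is the retailer's product URL;
// a data: URL stands in here so the sketch is self-contained.
$aSite = array( 'url' => 'data:text/plain,hello' ) ;

// file_get_contents() returns false on failure; the @ only hides the
// warning, so test the result before using it.
$sPage = @file_get_contents( $aSite[ 'url' ] ) ;

if ( $sPage === false ) {
    $sPage = '' ;   // fall back, log, or retry as appropriate
}
```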

But for walmart.com you have to convince it your server is a browser with some agent spoofing...

$sAgent = 'Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:44.0) Gecko/20100101 Firefox/44.0' ;

$hCURL = curl_init() ;

curl_setopt( $hCURL, CURLOPT_USERAGENT, $sAgent ) ;
curl_setopt( $hCURL, CURLOPT_URL, $sURL ) ;
curl_setopt( $hCURL, CURLOPT_FAILONERROR, true ) ;
curl_setopt( $hCURL, CURLOPT_FOLLOWLOCATION, true ) ;
curl_setopt( $hCURL, CURLOPT_AUTOREFERER, true ) ;
curl_setopt( $hCURL, CURLOPT_RETURNTRANSFER, true ) ;
curl_setopt( $hCURL, CURLOPT_TIMEOUT, 10 ) ;

$sHTML = curl_exec( $hCURL ) ;   // false on failure, since CURLOPT_RETURNTRANSFER is set
curl_close( $hCURL ) ;

As stated, the displayed content shows a student their faculty member's office hours and office location.



On 03/16/2016 11:55 AM, Henrik Rützou wrote:
Rob,

so how do you handle, let's say, jQuery running in a different version in
the main page than in the 'imported' page in a div?

How do you handle JavaScript that accesses the DOM by ID if the imported
page in the div uses the same IDs as the main page?

On Wed, Mar 16, 2016 at 5:51 PM, Rob <rob.couch@xxxxxxxxxxxxxx> wrote:

I agree with Nathan.

Simply get the contents of the page and display in a DIV or any other
suitable control of your choosing.

I do it every day without a problem. Well, sometimes I have to use
cURL for those sites that attempt to prevent crawling. If you need code to
make your server's request look like it came from a browser, just let me know.
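The "display in a DIV" technique above amounts to echoing the fetched markup inside an element of your own page. A minimal sketch, with a stub fragment standing in for the scraped content (the class name is just illustrative):

```php
// Stand-in for the fragment fetched from the remote site.
$sHTML = '<p>fetched content</p>' ;

// Wrap the imported markup in your own container element.
$sOutput = '<div class="imported">' . $sHTML . '</div>' ;

echo $sOutput ;
```

Note this serves the remote markup as part of your page, which is exactly why Henrik's questions about clashing jQuery versions and duplicate IDs matter.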

Happy Coding,
Rob

On 03/16/2016 09:58 AM, Nathan Andelin wrote:

Mike,

Last I checked, "frame-busting hacks" like X-Frame-Options were not part of
PCI. You probably have good grounds for ignoring your latest "PCI scan" in
that regard.

iframes provide useful functionality. Clickjacking is a problem with
malicious sites. Does anyone have cause to view your web applications as
being malicious?

However, if you really do want to provide content from multiple sites
without using iframes, the idea of using GETURI (or HTTPAPI) to retrieve
"content" from "foreign" sites and passing it through to your users seems
like the best alternative. It will require some redesign and rework of any
applications that currently use iframes.

--
Your Out-Source IT Department,
Rob Couch
IT Serenity
214 682 7638
Skype: itserenity


--
This is the Web Enabling the IBM i (AS/400 and iSeries) (WEB400) mailing
list
To post a message email: WEB400@xxxxxxxxxxxx
To subscribe, unsubscribe, or change list options,
visit: http://lists.midrange.com/mailman/listinfo/web400
or email: WEB400-request@xxxxxxxxxxxx
Before posting, please take a moment to review the archives
at http://archive.midrange.com/web400.




