CodeIgniter: no direct script access allowed
I have a Lumen (Laravel) project as a subdirectory inside my CodeIgniter project. When I try to access the Lumen project, CodeIgniter gives me the message "No direct script access allowed",
and here is the .htaccess file:
RewriteEngine On
RewriteBase /hosting/instances/1111
RewriteCond %{REQUEST_URI} ^system.*
RewriteRule ^(.*)$ /index.php?/$1 [L]
RewriteCond %{REQUEST_URI} ^application.*
RewriteRule ^(.*)$ /index.php?/$1 [L]
RewriteCond %{REQUEST_URI} ^files.*
RewriteRule ^(.*)$ /index.php?/$1 [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php?/$1 [L]
RewriteCond %{REQUEST_URI} /lumen-api/^(.*)$
RewriteRule /lumen-api/$1 [L,QSA]
In the lumen-api project the routes use the prefix v2/api, so the login route, for example, would be as follows:
When I hit this URL, it always gives me 'no direct access allowed'.
Why are you trying to use 2 separate PHP frameworks? I’m not surprised you are having issues doing this lol
1 Answer
This has nothing to do with your .htaccess file or the structure of the application folder. Each CodeIgniter file, except for the file that initiates CodeIgniter, usually starts with the lines:
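<?php
if ( ! defined('BASEPATH')) exit('No direct script access allowed');

That check is exactly what produces the message you are seeing.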
This stops unwanted users accessing your files.
In your case, because of the other framework, you are probably not initialising CodeIgniter properly. Normally BASEPATH is defined in the public/index.php file that starts CodeIgniter.
BASEPATH is a reserved constant in CodeIgniter that defines where the CodeIgniter system is installed. The system folder is separate to the application folder as multiple applications can share the same CodeIgniter system, without having to install the framework each time.
It sounds like you have to find a way to initialise CodeIgniter from within your other framework. Have a look at a getting-started-with-CodeIgniter tutorial, focusing specifically on how to set up the main index.php file.
Some background information:
- You can see all the reserved constants here: https://codeigniter.com/userguide2/general/reserved_names.html
- There is more chat about if (!defined('BASEPATH')) on the CodeIgniter forums: https://forum.codeigniter.com/post-348729.html
- index.php can be renamed, but it will be in a public folder and probably not in the application folder. It defines the location of the system folder, the application folder, the development environment, etc. It then processes a variety of things before starting the framework with require_once BASEPATH . 'core/CodeIgniter.php'; as its last line of code (a simplified sketch follows below). There is a lot of chat online about how to hide index.php in the URL, which makes finding other information about it difficult. This is where .htaccess often gets mentioned in relation to CodeIgniter.
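For orientation, here is a heavily simplified sketch of such a front controller. The real index.php that ships with CodeIgniter does more validation and path handling, and your folder names may differ:

<?php
// Simplified front controller sketch (not the exact file CodeIgniter ships).
define('ENVIRONMENT', 'development');         // development / testing / production

$system_path        = 'system';               // location of the CodeIgniter system folder
$application_folder = 'application';          // location of your application folder

// BASEPATH is the constant every protected file checks for.
define('BASEPATH', str_replace('\\', '/', realpath($system_path)) . '/');
define('APPPATH', rtrim($application_folder, '/') . '/');

// Last line: hand control over to the framework.
require_once BASEPATH . 'core/CodeIgniter.php';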
How do I configure IIS to allow access to network resources for PHP scripts?
I am currently working on a PHP front-end that joins together a series of applications running on separate servers; many of these applications generate files that I need access to, but these files (for various reasons) reside on their parent servers. If I, from the command line, issue a bit of script such as:
3 Answers
As you guessed, this is a permissions problem. The user that the IIS worker process is running as is a local account on the machine (most likely IUSER_ ), and that user isn't authenticated (nor does the account even exist!) on the other machines you're trying to browse via UNC.
Just as a test, you can go into IIS manager and change what the IIS service is running as for that website. I’m using terminology from Server 2003/IIS 6 though because I don’t have a Server 2008 box handy right now. If you poke around in the IIS 7 manager you should be able to find where you can set the user that the worker process runs as.
Your PHP application runs using a service account, and this is the user account whose credentials are used to access the network resources. The default for IIS is to use a local account of the web server for this, and that account doesn’t have permissions to access network resources (because it’s a local account).
You should configure the IIS application pool for your web site to run using a domain user account, and then give that user account the appropriate permissions on the network share.
If you don't have an Active Directory domain, or those two servers aren't members of it, things get a little trickier, but it can still be done by creating two user accounts with the same usernames and passwords on both servers.
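If it helps with the diagnosis, a tiny PHP script run under IIS can show which account the worker process is using and whether it can reach the share. The UNC path below is a placeholder, and the environment variable that holds the account name can vary between setups:

<?php
// Diagnostic sketch: which account is PHP running as, and can it read the share?
$share = '\\\\fileserver\\reports';   // placeholder - use your real UNC path

echo 'Running as: ' . getenv('USERNAME') . "\n";   // may be empty depending on the setup
echo is_readable($share)
    ? "Share is readable\n"
    : "Share is NOT readable - check the app pool identity and the share permissions\n";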
Prevent external access to PHP scripts but allow AJAX
I've read a lot about .htaccess rules, checking headers, using encryption etc., but I haven't found exactly the answer I'm after. I know that, assuming the server is set up right, you can't access my precious PHP scripts with AJAX. I tried checking if an access variable was defined, which disallowed address bar access but also blocked my AJAX requests. If I have some PHP scripts that I use for AJAX calls, is there a way I can prevent address bar access, PHP POST (cURL etc.) and AJAX from outside my domain (assumed blocked via cross-domain access restrictions)?
@paranoid-android No, you should be protecting your AJAX requests via some kind of authentication. "AJAX" doesn't imply "unauthenticated".
Anything is possible. You could listen for anything that isn't an AJAX request and display a message when someone tries to directly access the PHP file, but headers are easily spoofed, so that's not really secure. You could use .htaccess to block access from any IP that isn't your own, or you could use PHP to do the same, etc.
4 Answers
There is absolutely NO way to safely/reliably identify which part of the browser the request comes from (address bar or AJAX). There is a way to identify what is sending the request (browser/curl/etc.) via the User-Agent header, but not reliably.
A quick but a lot less reliable solution would be to check for the following header, which most browsers attach to AJAX calls. Be sure to look into it thoroughly before implementing it.
X-Requested-With: XMLHttpRequest
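In PHP, a minimal version of that check might look like the sketch below. Treat it as a convenience filter only, since the header is supplied by the client:

<?php
// Reject requests that don't carry the AJAX marker header.
// Any HTTP client can fake this header, so this is not real security.
$requestedWith = isset($_SERVER['HTTP_X_REQUESTED_WITH'])
    ? $_SERVER['HTTP_X_REQUESTED_WITH']
    : '';

if (strcasecmp($requestedWith, 'XMLHttpRequest') !== 0) {
    header('HTTP/1.1 403 Forbidden');
    exit('Direct access not allowed');
}

// ...handle the AJAX request as normal...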
NOTE: Do not trust the client if the resource is crucial. You are better off implementing some other means of access filtering. Remember, anyone can fake headers!
You can check whether the request is an AJAX request and forbid it if it isn't, but that's not really safe because the headers can be manipulated.
What you can do is to block every IP except the IP which is allowed to access those files.
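A minimal sketch of that IP filter in PHP could look like this (the addresses are placeholders for whatever clients you actually want to allow):

<?php
// Allow only requests from known addresses; everything else gets a 403.
$allowed = array('203.0.113.10', '127.0.0.1');   // placeholder addresses

if (!in_array($_SERVER['REMOTE_ADDR'], $allowed, true)) {
    header('HTTP/1.1 403 Forbidden');
    exit('Forbidden');
}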
What you can also do is implement a kind of authentication, where external applications have to send credentials to your script and the script checks whether the client is valid.
There are many ways, but none of them is really the best way to achieve maximum security.
I don't know of a definitive way. Indirectly, however, you can do this: pass a unique and constantly changing parameter (GET or POST) that only you have access to as proof of the origin. If the request lacks this unique variable, then it's not from you. Think outside the box on this one. It could be anything you want; here are some ideas.
1) Pass the result of a mathematical equation as proof of origin. Something that you can programmatically predict, yet that is not obvious to prying header hackers, e.g. cos($dayOfYear) or even better base64_encode(base64_encode(cos($dayOfYear))).
2) Store a unique key in a database that changes every time someone accesses the page. Then pass that key along with the request and do some checks on the end page; if they don't match the database key, you've found the peeping tom. (Note that there will be logic involved in making sure the key hasn't changed in between transmission of requests. A rough sketch follows below.)
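Here is a rough PHP sketch of idea 2. The table name request_keys, its columns and the connection credentials are made up for this example; it also assumes PHP 7+ (random_bytes) and mysqli with mysqlnd (get_result):

<?php
// Assumes a table: request_keys(session_id VARCHAR PRIMARY KEY, request_key CHAR(64))
session_start();
$db = new mysqli('localhost', 'user', 'pass', 'mydb');   // placeholder credentials

// When rendering the page: issue a fresh key and store it for this session.
function issue_key(mysqli $db) {
    $sid = session_id();
    $key = bin2hex(random_bytes(32));
    $stmt = $db->prepare('REPLACE INTO request_keys (session_id, request_key) VALUES (?, ?)');
    $stmt->bind_param('ss', $sid, $key);
    $stmt->execute();
    return $key;   // embed this in the form / AJAX payload
}

// In the AJAX endpoint: compare the submitted key with the stored one.
function verify_key(mysqli $db, $candidate) {
    $sid = session_id();
    $stmt = $db->prepare('SELECT request_key FROM request_keys WHERE session_id = ?');
    $stmt->bind_param('s', $sid);
    $stmt->execute();
    $row = $stmt->get_result()->fetch_assoc();
    return $row && hash_equals($row['request_key'], (string) $candidate);
}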
How can I deny direct access to a directory, but allow it from a PHP script?
I'm developing a web application and I need to save and show images and PDF documents. I wanted to deny direct access to the images and documents and to their containing folder. I mean that when someone tries to access the folder via URL, they should receive a 403 Forbidden error. For this I created a .htaccess file inside the folder like this:
Order deny,allow
Deny from all
header('content-type: application/pdf');
readfile('../../../files/'.$document);
But when I try to access the images, my access is denied and I receive an HTTP 403 Forbidden status code. How can I access the images from my web application while denying direct access to them? Also, I would like to know why I can access the PDF documents but I can't access the images.
Create a script which will verify that the user is logged in and has the right permissions, and serve all files through that script. (I.e. create a file called file.php and pass it a document id. The script will verify the user's access level and output the file. Your img tag will need to be src='../file.php?img=12' or something like that.)
2 Answers
I would use a handler for either of these file types.
Place the files outside of the web accessible file system. i.e. If your web root is /var/www/html then create /var/www/files/ directory and store all your files in there.
$file_id = intval($_REQUEST['file_id']);
$sql     = sprintf("SELECT * FROM files WHERE file_id=%d", $file_id);
$query   = $mysqli->query($sql);
$file    = $query->fetch_assoc();

// add business logic here:
// check that $user_id is allowed to view $file_id

if (preg_match("/\.pdf$/i", $file['file_name'])) {
    header('Content-Type: application/pdf');
} else if (preg_match("/\.(jpg|gif|png)$/i", $file['file_name'], $matches)) {
    // send the matching image MIME type
    $ext = strtolower($matches[1]);
    header('Content-Type: image/' . ($ext === 'jpg' ? 'jpeg' : $ext));
} else {
    die("Unknown file type");
}

$full_path = sprintf("/var/www/files/%s", $file['file_name']);
readfile($full_path);
This would allow you to use your application logic to determine which files should be accessed by a user, record the access and keep them out of the web accessible directory.
So instead of using something like this
<img src="/files/some-image.jpg">
I would suggest using a syntax similar to this for handling images
<img src="/file.php?file_id=12">
and a link like this for downloading PDFs
<a href="/file.php?file_id=15">Download PDF</a>
It should be pretty straightforward, assuming you have a database of PDFs and images. Something simple like this:
CREATE TABLE `files` (
  `file_id` int(11) unsigned NOT NULL AUTO_INCREMENT,
  `file_name` varchar(255) NOT NULL DEFAULT '',
  PRIMARY KEY (`file_id`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;