About 1,000 results for to-file - Page 1
Re: Cannot run exe from asp.net
Programming
Web Development
13 Hours Ago
by Neil_brown001
… server You can technically make the server launch an .exe file like Notepad, but: It will run in the background on… appear, it could be an HTML/CSS issue, or the file isn’t loading. Make sure the app is being served…
Re: Create Infinite Loop In Bash
Hardware and Software
Linux and Unix
13 Hours Ago
by asadalikhan
`while true; do echo "Running..."; sleep 60; done` This runs forever, printing "Running..." every 60 seconds. No script file needed, just drop it in the terminal and you're good.
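If you do want the same idea in a reusable script, here is a minimal sketch; the function name and the bounded count parameter are hypothetical additions so it can terminate (the one-liner above runs forever):

```shell
#!/bin/sh
# Bounded variant of the infinite loop above: prints "Running..."
# $1 times, sleeping $2 seconds between prints.
run_heartbeat() {
    count=$1
    interval=$2
    i=0
    while [ "$i" -lt "$count" ]; do
        echo "Running..."
        sleep "$interval"
        i=$((i + 1))
    done
}

run_heartbeat 3 0   # three "Running..." lines, no delay
```

Replace the `for`-style bound with `while true` to get back the original always-on behavior.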
Re: Cannot run exe from asp.net
Programming
Web Development
5 Days Ago
by Reverend Jim
>No, Javascript cannot run/start executables on the client machine. Technically correct, but there are ways around it. For example, save a file in a special folder on the target computer that has a folder watch on it. The watching task could then trigger a local task.
Re: How to open an Excel Document in VB.NET
Programming
Software Development
6 Days Ago
by Pelorus_1
A Microsoft Excel document can be opened in VB.NET using the Microsoft.Office.Interop.Excel library. Using Workbooks.Open(), open the workbook, and then display the Excel window. Reference the Excel COM object in your project.
What’s Best Way to Clear File Contents Without Deleting the File in Linux
Hardware and Software
Linux and Unix
2 Weeks Ago
by adildixi
… DedicatedCore that explains different ways to shorten or empty a file in Linux without deleting it altogether. If anyone else is…
Re: What’s Best Way to Clear File Contents Without Deleting the File in Linux
Hardware and Software
Linux and Unix
2 Weeks Ago
by Reverend Jim
I'm confused. Since the log file is recreated automatically, what is the problem with just deleting it? But if you want to shorten it you could always open it as write (but not append) and write a null string to it.
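The open-for-write trick described above is exactly what shell redirection does; a quick sketch (the `app.log` name is hypothetical):

```shell
#!/bin/sh
# Make a throwaway log file to demonstrate on.
printf 'line1\nline2\n' > app.log

# Empty it without deleting: redirecting nothing into the file
# truncates it to zero bytes while keeping the same inode, so a
# process that still has it open keeps writing to the same file.
: > app.log

# `truncate -s 0 app.log` (coreutils) has the same effect.
wc -c < app.log   # file is now 0 bytes
```

Keeping the inode is the reason this is preferred over delete-and-recreate for logs held open by a running daemon.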
Re: What’s Best Way to Clear File Contents Without Deleting the File in Linux
Hardware and Software
Linux and Unix
2 Weeks Ago
by Salem
I guess most people just use https://linuxconfig.org/logrotate Set it and forget it. There's no need to resort to random per-file hackery (or quackery).
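The logrotate suggestion is normally a drop-in config; a hedged sketch (the path and rotation policy here are hypothetical, not from the thread):

```
# Hypothetical /etc/logrotate.d/myapp entry: rotate weekly, keep four
# old copies, compress them, and truncate the live file in place
# (copytruncate) so the writing process never has to reopen it.
/var/log/myapp/app.log {
    weekly
    rotate 4
    compress
    copytruncate
}
```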
Re: How to open an Excel Document in VB.NET
Programming
Software Development
1 Week Ago
by PitSterw
I've worked on something similar and found that using Microsoft.Office.Interop.Excel lets you open the file and loop through cells easily.
Re: Google Search Console -> Page indexing
Digital Media
Digital Marketing
Search Engine Strategies
2 Weeks Ago
by ashleydent4u
…'re on the right track already, and your robots.txt file actually looks okay at first glance. However, if 1.13K…
Re: Is linux your daily driver?
Hardware and Software
Linux and Unix
2 Weeks Ago
by Reverend Jim
… bonus, once I was done she also understood fragmentation and file pointers as well as the difference between a full and…
Re: Page Crawling Issue
Digital Media
Digital Marketing
1 Week Ago
by jonathannweyer
If SEMRush is only crawling a few pages, it might be due to the crawl settings or limits in your SEMRush account. Check if there are any crawl restrictions set up and make sure your robots.txt file isn’t blocking any important pages.
Re: Cannot run exe from asp.net
Programming
Web Development
1 Week Ago
by lennyli
> Are you sure IIS is configured to allow running external scripts? The document folder and asp file have security permissions set to ALL rights for 'everyone'. In IIS, under handler mappings for .asp files, under request restriction/access, 'script' was chosen (not 'execute') for feature permissions; all of 'read', 'script', and 'execute' are chosen.
Re: Question/Answering over SQL Data Using LangGraph Framework
Programming
Computer Science
2 Weeks Ago
by Pelorus_1
Through its combination of natural language processing and structured query generation, LangGraph makes interfacing with databases and extracting insights over SQL data easier than ever.
Re: How to open an Excel Document in VB.NET
Programming
Software Development
1 Week Ago
by JamesMichaelm
I see it's been a while since you posted, but I'm curious if anyone here has tried using the Open XML SDK instead of Interop or OleDb for reading Excel files. I found it faster and doesn't require Excel to be installed, though it can be more complex for writing. Wondering how others handle big Excel files or ones with tricky formatting?
python programming issue
Programming
2 Months Ago
by Ccrobinson001
… = hook(**hook_input['kwargs']) File "C:\Program Files\WindowsApps\…self._get_build_requires(config_settings, requirements=[]) File "C:\Users\Corne\AppData…
Re: python programming issue
Programming
2 Months Ago
by Pebble94464
…? `pip install --upgrade pip setuptools` I recreated your requirements.txt file based on the output that you provided. Please see below…
how to get back visual basic 6 project again on coding again
Programming
2 Months Ago
by Syed Zubair Ali
… entire project has been collapsed. I have an exe backup file to run this project and it is working well; however, I need…
Re: how to get back visual basic 6 project again on coding again
Programming
2 Months Ago
by Salem
… be a cheap option. > I have exe back up file to run this project and working well I'm going…
Re: how to get back visual basic 6 project again on coding again
Programming
2 Months Ago
by Shajjad_1
To get back to coding a Visual Basic 6 project, locate your project files (e.g., .vbp, .frm, .bas). Ensure you have Visual Basic 6.0 installed on your system, as modern IDEs don't support VB6. Open the project file in the VB6 IDE, and you can resume editing and coding.
Re: Google Search Console -> Page indexing
Digital Media
Digital Marketing
Search Engine Strategies
2 Months Ago
by Olu_610
Errors in robots.txt file settings and sitemap failures frequently trigger indexing issues. Verify that the robots.txt permits Googlebot to explore all pages, and make sure your sitemap is fully updated and properly submitted to Search Console.
Re: How would we poison AI web crawls?
Hardware and Software
Information Security
2 Months Ago
by Dani
…is that good bots respect your robots.txt file, which dictates which part of your site …crawled, while bad bots tend to ignore this file. However, that does not mean it's…seems that the AI bots are ignoring this file. That is completely untrue. Bad bots tend …to ignore the robots.txt file, but, as mentioned, that just means there…
Re: How would we poison AI web crawls?
Hardware and Software
Information Security
2 Months Ago
by Dani
… get OpenAI to respect their robots.txt file either have a syntax error in their file, and/or aren't naming the…
Re: How to Implement Lazy Loading for Faster Web Portals
Programming
Web Development
2 Months Ago
by Dani
… rendering of the HTML page. Use `defer` if your Javascript file requires the HTML DOM to be ready, or relies on… Javascript files, and use `async` if you want the Javascript file to be executed the moment that it's finished downloading…
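The `defer`/`async` distinction boils down to two attributes on the script tag; a minimal sketch with hypothetical file names:

```html
<!-- defer: download in parallel, execute in document order after the
     DOM is fully parsed — safe for scripts that touch the DOM. -->
<script src="dom-setup.js" defer></script>

<!-- async: execute the moment the download finishes, in whatever
     order downloads complete — fine for independent scripts. -->
<script src="analytics.js" async></script>
```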
Re: How would we poison AI web crawls?
Hardware and Software
Information Security
2 Months Ago
by Reverend Jim
…. But it seems that the AI bots are ignoring this file. As such, any action taken against them by site owners…
Re: How would we poison AI web crawls?
Hardware and Software
Information Security
2 Months Ago
by Dani
…. This is not my experience. OpenAI respects my robots.txt file perfectly. I do want to add, though, that robots.txt…
Re: Json response in javascript
Programming
Web Development
2 Months Ago
by Mr.M
…: startDate, end: endDate}]} On my current code in my php file which gets the data from database is as follows While…
Re: Page Crawling Issue
Digital Media
Digital Marketing
2 Months Ago
by naps2
If you are encountering a page crawling issue, you should check your robots.txt file to ensure that important pages are not blocked. You should also check Google Search Console for crawl errors or indexing issues.
Re: Linking Sitemap to all the Webpages
Digital Media
Digital Marketing
Search Engine Strategies
2 Months Ago
by adilhassan
You don’t need to link your sitemap to every webpage. It’s enough to submit your sitemap to Google Search Console and place a link to it in your robots.txt file. This helps search engines discover and crawl your pages efficiently without needing direct links on your website.
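The robots.txt pointer described above is a one-line `Sitemap:` directive; a minimal sketch with a hypothetical domain:

```
# Hypothetical robots.txt: allow all crawling, advertise the sitemap.
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```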
Re: Page Crawling Issue
Digital Media
Digital Marketing
2 Months Ago
by ashleydent4u
… to your crawl settings or restrictions. Check your robots.txt file and meta tags to ensure SEMrush isn’t blocked from…
Re: Page Crawling Issue
Digital Media
Digital Marketing
2 Months Ago
by MasoodDidThat
> Found the solution I tried a method where I went to the Site Audit settings on Semrush, uploaded a .txt file containing all the URLs I wanted to get indexed, and then re-ran the campaign. It worked.
© 2025 DaniWeb® LLC