Getting HTTP conditional header(s) is not met #25784
Hi @mmb5, thanks for reaching out. Without looking at the code too much, my suspicion is that the one Storage account that is not working has Hierarchical Namespace (HNS) disabled. The `DataLakeServiceClient` is intended for accounts with HNS enabled; for regular blob accounts, you should use the `azure-storage-blob` package instead. Please have a look to confirm whether this is the case. If not, and both accounts have HNS enabled, we can investigate further. Thanks!
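For an account without HNS, a minimal sketch of the equivalent upload via the blob SDK could look like the following; the connection string, container, and blob names are placeholders, and the async `azure-storage-blob` package is assumed.

```python
# Hedged sketch: uploading to a regular (non-HNS) blob account with azure-storage-blob.
# The connection string, container, and blob names below are placeholders.
import asyncio
from azure.storage.blob.aio import BlobServiceClient


async def write_blob(connection_string, container_name, blob_name, content):
    service_client = BlobServiceClient.from_connection_string(connection_string)
    async with service_client:
        blob_client = service_client.get_blob_client(container=container_name, blob=blob_name)
        # overwrite=True avoids a conditional "blob must not exist" failure on re-runs.
        await blob_client.upload_blob(content, overwrite=True)


# asyncio.run(write_blob("<connection_string>", "raw", "123.txt", "456"))
```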
Thanks for the quick reply. Strangely, it appears to be the opposite: the working one has hierarchical namespace disabled, and the non-working one has it enabled.
Hi @mmb5, I had a closer look at this, and I think I understand what is going wrong. The main issue in your sample is that you are first calling `create_file()` and then calling `upload_data()` without `overwrite=True`. In that case `upload_data` also tries to create the file and sends a conditional header requiring that the file does not already exist, so the request fails with `ConditionNotMet` because the file was just created.

There are a number of ways to solve this; here is a sample with a few different solutions:

```python
from azure.core import MatchConditions

file_content = "456"
file_client = dir_client.get_file_client("123.txt")

# 1. Use overwrite=True. No need to create the file first in this case.
await file_client.upload_data(data=file_content, length=len(file_content), overwrite=True)

# 2. Use append and flush after creating the file.
await file_client.create_file()
await file_client.append_data(file_content, offset=0, length=len(file_content))
await file_client.flush_data(len(file_content))

# 3. Use access conditions. This example uses an if-not-modified condition
#    with the etag of the file taken after its creation.
await file_client.create_file()
etag = (await file_client.get_file_properties()).etag
await file_client.upload_data(
    data=file_content,
    length=len(file_content),
    etag=etag,
    match_condition=MatchConditions.IfNotModified,
)
```

NOTE: This still only applies to accounts with HNS enabled. For accounts without HNS, the recommendation is still to use the `azure-storage-blob` package.

Hopefully this clears up the confusion and one of the provided solutions works for you. Please feel free to reach out if you have further questions. Thanks.
Thank you for the examples and your help. #1 is working fine. I will close this issue.
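As an illustration (not code from the thread), a minimal sketch of fix #1 applied to the `write_file` helper from the repro below, assuming the same clients and parameters:

```python
# Sketch of fix #1: drop the explicit create_file call and pass overwrite=True,
# so upload_data creates (or replaces) the file itself.
from azure.storage.filedatalake.aio import DataLakeServiceClient


async def write_file(datalake_connection: DataLakeServiceClient, container_name,
                     directory_name, file_name, file_content):
    file_system_client = datalake_connection.get_file_system_client(container_name)
    dir_client = file_system_client.get_directory_client(directory_name)
    if not await dir_client.exists():
        await dir_client.create_directory()
    file_client = dir_client.get_file_client(file_name)
    return await file_client.upload_data(
        data=file_content, length=len(file_content), overwrite=True
    )
```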
I'm receiving a ConditionNotMet on the upload_data method of the file client
Code is included to create a simple file:
```python
import asyncio
import traceback

from azure.storage.filedatalake.aio import DataLakeServiceClient


async def write_file(datalake_connection, container_name, directory_name, file_name, file_content):
    try:
        file_system_client = datalake_connection.get_file_system_client(container_name)
        dir_client = file_system_client.get_directory_client(directory_name)
        if not await dir_client.exists():
            await dir_client.create_directory()
        file_client = await dir_client.create_file(file_name)
        # This is the call that raises ConditionNotMet on the HNS-enabled account.
        upload = await file_client.upload_data(data=file_content, length=len(file_content))
        await file_client.close()
        return upload
    except Exception as e:
        print(e)
        traceback.print_exc()
        return


if __name__ == '__main__':
    dlc = DataLakeServiceClient.from_connection_string(<secret_string_here>)
    x = asyncio.get_event_loop().run_until_complete(write_file(dlc, "raw", "ebis", "123.txt", "456"))
    asyncio.get_event_loop().run_until_complete(dlc.close())
    print(x)
```
The file should upload without issue. Note that I am trying this on two different storage accounts. On one storage account, it works fine. On the other account, I get this trace:
File "C:\projects\pyCharm\X\X\tests\for_ms.py", line 12, in write_file upload = await file_client.upload_data(data=file_content, length=len(file_content)) File "C:\projects\pyCharm\X\venv\lib\site-packages\azure\storage\filedatalake\aio\_data_lake_file_client_async.py", line 364, in upload_data return await upload_datalake_file(**options) File "C:\projects\pyCharm\X\venv\lib\site-packages\azure\storage\filedatalake\aio\_upload_helper.py", line 103, in upload_datalake_file process_storage_error(error) File "C:\projects\pyCharm\X\venv\lib\site-packages\azure\storage\filedatalake\_deserialize.py", line 215, in process_storage_error exec("raise error from None") # pylint: disable=exec-used # nosec File "<string>", line 1, in <module> azure.core.exceptions.ResourceModifiedError: (ConditionNotMet) The condition specified using HTTP conditional header(s) is not met. RequestId:6c801897-c01f-0089-0e15-b4687a000000 Time:2022-08-19T21:49:52.5871118Z Code: ConditionNotMet Message: The condition specified using HTTP conditional header(s) is not met.
Why would it work on one storage account and not another?